SNIA Academy’s presentations hit the storage target
It was good to see the Storage Networking Industry Association (SNIA) Academy finally set foot in the UK, with its all-day event last Tuesday in London.
The Academy has
been staged in locations around Europe since
2005, and the emphasis is on education, especially about the latest storage technology and its trends, challenges and issues. The material throughout the day was of high quality and hit the major topical themes that those involved with IT storage would want to hear more about. So well done, SNIA.
I can only skim
the surface here and it is only my take anyway. So my apologies to those I fail
to mention (or even hear, as there were break-out sessions running in parallel
and I could not attend everything).
Jon Tate, SNIA
Europe UK committee chair, opened the proceedings by saying the Association had
been in transition since 2004–5, emphasising not just storage but also
information—an obvious move since all stored data is information and, increasingly, decisions on storage need
to be made based on what the actual information consists of.
Then followed a
very sobering presentation from Ann La France, worldwide legal counsel for Squire, Sanders & Dempsey, who knows a thing or three about the thorny subjects of compliance and of data protection versus freedom of information. One
of her themes was data retention (backed by the EU data retention directive of
2006). She pointed out that the best protection against data breaches was to delete data once the minimum retention period had expired. The average cost of a breach involving the loss of unencrypted data was estimated at £45–£70 per record, just in lost business and administration, while the loss of consumer confidence was unquantifiable, she said.
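To put those per-record figures into pounds, here is a quick worked example; the record count is mine, purely hypothetical:

```python
# What the £45-£70 per-record estimate means at scale.
# The record count is hypothetical, chosen only for illustration.
records = 100_000
low, high = 45, 70   # per-record cost in pounds, from La France's estimate

print(f"£{records * low:,} to £{records * high:,}")
# £4,500,000 to £7,000,000 -- before any loss of consumer confidence
```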
The two opposing forces were privacy, encouraging an early purge, and governments wanting access to personal data on national security grounds (with the UK government currently in breach of EU directives on this point!). In the middle sits regulatory compliance, requiring that certain information be retained but also demanding security, including timely deletion. Frankly, nobody much is deleting anything at present, and La France thought many were ignoring the problem in the hope it would go away. Meanwhile the storage mountain grows.
It might have been
useful to put La France
in a panel debate alongside Nick
Baker of Sun Microsystems, whose theme was ‘best practices for long-term retention of digital information’. He
qualified this title with the word ‘preservation’, pointing out the major problem of retrieving long-held data. SNIA had carried out a 100-year archive survey. Frighteningly, 68% of the companies contacted had data they said needed retaining for over 100 years, rising to 83% for over 50 years; 53% even said they had data needed in perpetuity.
In some cases this
longevity stemmed from requests by government. So shouldn’t governments defray
the costs? (Oh, that means the taxpayer pays; perhaps I should retract that.) Preservation, said Baker, was a bigger problem for semi-structured or structured data; for instance, Oracle objects and tables relate to each other, so metadata is needed to describe the stored information and make it genuinely discoverable.
Apart from a regular technical refresh involving migration to the latest software versions, there was the matter of physical and logical migration as formats became outdated. Baker emphasised that logical and physical migration should not be mixed; only some 30%, he said, were doing this correctly on disk, while nobody was for tape or optical. In other words: “record to tape and lose.”
SNIA’s answer was a ‘holistic approach’, not stove-piped into silos of unconnected information, which required an understanding of what an object was in every case. The metadata format had to be correct, and an audit trail maintained from the original object, with versioning of the archive objects. He also put in a plug for XAM (eXtensible Access Method), SNIA’s emerging standard for metadata (which I have previously covered and believe has longer-term potential).
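For the curious, here is a toy sketch, in Python, of the kind of archive record Baker describes; it is emphatically not the XAM format, just an illustration of an object carrying the metadata, audit trail and version history needed to stay discoverable:

```python
# Toy model of an archive record: object plus metadata, audit trail and
# version history. Purely illustrative -- NOT the XAM format.
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class ArchiveObject:
    object_id: str                  # stable identifier for the original object
    content_format: str             # logical format, e.g. "oracle-table-export"
    metadata: dict                  # describes what the stored information *is*
    audit_trail: list = field(default_factory=list)  # every action since ingest
    versions: list = field(default_factory=list)     # superseded logical formats

    def migrate(self, new_format: str, reason: str) -> None:
        """Logical migration: record the old format, then adopt the new one."""
        self.versions.append(self.content_format)
        self.audit_trail.append((datetime.utcnow().isoformat(), reason))
        self.content_format = new_format
```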
Despite this, it
all sounds expensive and time-consuming to me. Worse, said Baker, it sat at the bottom of the IT hierarchy and so lacked adequate funding; it should therefore be pushed back to the business as a serious risk.
Among the other main items, John Rollason (SNIA UK committee and NetApp) covered every aspect of storage virtualisation and how to use it effectively, while Bill Bolton (SNIA UK and Brocade) gave us just about all we should ever need to know about Fibre Channel, its history and its clear road ahead. Mark Galpin of Quantum’s overview of de-duplication technologies highlighted major differences in de-dupe approaches, while Steve Collins of Pillar Data Systems covered current trends in data protection and restoration technologies, not least continuous data protection (CDP).
The final
presentation of the day, by Sol Squire (SNIA Europe Nordic Committee and Data
Islandia) on building a green data centre, was full of practical tips for data
centre managers overwhelmed by their challenges. Not the least of these was spending a little money on sensors so as to plot the power flow through the data centre. “60% of cooling is wasted; measure what you have,” he said.
Illustrating the point, he told of data centre managers who identified the flows and then strategically placed a shower curtain, saving 40% of the cooling bill at a stroke! On a similar theme of heat output versus cooling, he said (perhaps to the consternation of some company security managers), “You can open a window in the data centre.” (The ultimate alternative, building a new data centre when resources run out, costs an average of £20m.)
Squire also advocated
investigating renewable energy. (Iceland, where he is based, runs on 100% renewable energy, but has a population of only 300,000.) He also recommended setting small, realisable goals, as little things had a greater effect down the line. Then,
he said, “hopefully our grandchildren will still look up and see a blue sky.”
Finally, a couple of points from two vendor-specific break-out sessions I attended are worthy of mention.
Trevor Kelly, EMEA systems engineering manager for 3PAR, was discussing thin provisioning. In the course of this he cited a recent Glasshouse Technologies survey of 350 host systems in 12 large companies, which showed storage utilisation still below 30%. Frankly, with the virtualisation and other technologies now available, and the green impact moving up the agenda, this is an unacceptable waste of resources.
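To put that sub-30% figure in perspective, here is a quick back-of-envelope sketch (the fleet size is hypothetical) of what sits bought but empty, and what thin provisioning is meant to defer:

```python
# Back-of-envelope sketch of the waste a 30% utilisation figure implies.
# The fleet size below is hypothetical, chosen only for illustration.
allocated_tb = 1000      # capacity pre-purchased and allocated to hosts
utilisation = 0.30       # survey figure: utilisation still below 30%

used_tb = allocated_tb * utilisation
idle_tb = allocated_tb - used_tb

print(f"Of {allocated_tb} TB allocated, roughly {used_tb:.0f} TB holds data;")
print(f"{idle_tb:.0f} TB sits bought, powered and spinning but empty.")
# Thin provisioning tackles this by letting hosts see their full allocation
# while physical disk is added only as data is actually written.
```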
Meanwhile, Rick Terry of IBM provided interesting, nay alarming, slides about how disk areal density improvements, which had for decades kept pace with Moore’s Law for computer chips, were now tailing off. He predicted a disk price crunch as it became more difficult to get larger capacities; and, with the huge data capacities now needed, small error rates extrapolate to more frequent failures. As he put it, “A 1PB (petabyte) drive fails every 10 days.”
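How do small error rates become ten-day failures? A back-of-envelope sketch; the error rate and throughput below are my assumptions, not Terry’s figures:

```python
# Back-of-envelope: how often a petabyte-scale store hits an unrecoverable
# read error. Both figures are assumptions for illustration, not numbers
# from the talk.
UBER = 1e-15          # assumed unrecoverable bit error rate (errors per bit read)
READ_RATE = 150e6     # assumed sustained read throughput, bytes per second

bits_per_error = 1 / UBER               # ~1e15 bits read per expected error
bytes_per_error = bits_per_error / 8    # ~125 TB read between errors
days_per_error = bytes_per_error / READ_RATE / 86400

print(f"One unrecoverable read error roughly every {days_per_error:.1f} days")
# ~9.6 days at these assumed figures -- the same order as the quoted
# "fails every 10 days": tiny per-bit rates stop being tiny at petabyte scale.
```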
Maybe there is an overall message from the day: try to tackle the storage mountain itself and do some serious data deletion. That way, all the other issues and concerns will shrink in size and cost.