Intel Analyst Summit
Content Copyright © 2014 Bloor. All Rights Reserved.
Also posted on: The Norfolk Punt
I’ve been following Intel Software for some time (see here). It has some nice stories to tell, around, for example, instrumenting a baby’s sleep suit. Apparently, babies sometimes produce noxious semi-fluids in an autonomous fashion; and a sensor can detect this, an Intel processor can decide what to do about it (personally, throwing the baby out with the bath water seems like a good option, to me) and it can broadcast to all who might need to know about this event. Very “Internet of Things”.
Recently, however, I’ve been to a ‘proper’ Intel Analyst Summit, in London, in order to get an insight into what is going on at Intel generally. Well, it seems that things are healthier than they once were, although the recent downturn seems to have been a bit of a wake-up call for Intel. The way that PC sales used to track the economic good-health figures appears to be broken now, since PC sales are falling off the graph. However, Intel has noticed that if you combine the sales for PCs and tablets, the line behaves as it used to, and this suggests that there is still plenty of space for Intel to sell into—especially as it now sees the need to support Android. The good news is that support for all appropriate operating systems, including Android, will ship with Intel’s new products from day one.
The interesting news is that Intel appears to be reinventing itself. Rather than concentrating on technology and expecting great user experiences to follow from great technology, it is now concentrating on great user experiences and using its technology to deliver them. Which makes sense to me, although it rather implies that Intel will become a software (or, to use a sometimes overused word, solutions) company that happens to design and make excellent chips (and which can make money from using its chip fabrication facilities to produce chips for third parties). That change in point of view may well expose some cultural issues inside Intel for it to address…
One problem, possibly, is that Intel itself is still a bit Intel-centric. It has some really great ideas, such as the XDK, its HTML5 cross-platform development environment, and more; and its ideas of “thinking as a newcomer, not an incumbent” and focussing on what the customer actually wants, rather than on what Intel engineers think it should want, are spot on. What it is perhaps missing (at a fundamental, cultural level; I’m sure it recognises it at a superficial level) is that one result of cloud and virtualisation is that no-one really cares what chip is inside their device.
So, Intel may think that it has caught up with ARM in the ‘bang per buck’ device space (and that ARM doesn’t have Intel’s credibility in the server space) but, increasingly, people just care about having the very latest Samsung smartphone rather than about what powers it, and this attitude is migrating to the enterprise space. Competition, in many areas, is increasingly in terms of SLAs and useful user features, not in terms of processor performance, which is assumed to be generally ‘fit for purpose’ these days. This means that any messages around “computing is best with Intel” (which is what we were told now underlies Intel strategy) may be hard to sell to the sort of millennial customer (one who has grown up with the Internet) that Intel recognises it needs.
Of particular interest to me are Intel’s initiatives around higher-level cloud standards—typified by its sponsorship (along with AT&T, Cisco, GE and IBM) of the Industrial Internet Consortium (IIC). In order to be effective, these days, such a consortium has not only to be independent of vendors and their ‘special interests’ but also to be seen to be so—the IIC is set up as a group under the Object Management Group (OMG), which is a good start, although only a starting point.
The IIC is “the not-for-profit organization that catalyses, coordinates, and manages the collaborative efforts of industry, academia, and the government to accelerate growth of the Industrial Internet”. So, it should be interested in things like cross-vendor specifications for service levels (if two services claim ‘five nines’ availability, for example, can one of them include an “except for planned downtime” cop-out clause?).
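As a back-of-the-envelope illustration of why that matters, here is a small Python sketch (with made-up figures, not taken from any real vendor’s terms) of how little downtime ‘five nines’ actually allows, and how large a gap a planned-downtime exemption can open between reported and experienced availability:

```python
# Back-of-the-envelope sketch: what 'five nines' allows per year, and how a
# planned-downtime exemption changes the picture. All figures are illustrative.

MINUTES_PER_YEAR = 365 * 24 * 60

def downtime_budget(availability: float) -> float:
    """Minutes of permitted downtime per year for a given availability target."""
    return MINUTES_PER_YEAR * (1 - availability)

def effective_availability(unplanned_min: float, planned_min: float,
                           exclude_planned: bool) -> float:
    """Availability as reported under a given definition of what counts as downtime."""
    counted = unplanned_min if exclude_planned else unplanned_min + planned_min
    return 1 - counted / MINUTES_PER_YEAR

print(f"Five nines budget: {downtime_budget(0.99999):.2f} minutes/year")  # ~5.26

# Two hypothetical services, both claiming 'five nines'. Service A counts all
# downtime; Service B excludes 4 hours a month of 'planned maintenance'.
planned_b = 4 * 60 * 12
print(f"A: {effective_availability(5, 0, exclude_planned=False):.5%}")
print(f"B (as reported):    {effective_availability(5, planned_b, exclude_planned=True):.5%}")
print(f"B (as experienced): {effective_availability(5, planned_b, exclude_planned=False):.5%}")
```

On these invented numbers, both services can claim ‘five nines’ under their own definitions, while one of them is actually unavailable for roughly two days a year. That is exactly the sort of thing a cross-vendor specification would need to pin down.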
The IIC might have some valuable potential synergies with a couple of other Intel initiatives: its “Powered by Intel Cloud Technology” badge, announced in January 2014, and the Intel Data Centre Manager Service Assurance Administrator. The badge program is about exposing the technology behind a cloud service, so that you can have informed confidence that its advertised SLAs will be met, and can match different workloads to different cloud services in a sophisticated way. The Service Assurance Administrator, designed primarily to facilitate the rapid deployment of workloads on software-defined infrastructure with established service levels, supplies, or could/will supply, the metrics needed for real assurance of the badge program—and could provide at least some of the metrics needed to drive the IIC collaboration initiatives, with real interoperability testing of SLAs and so on. And, of course, the badge can be used to drive up Intel’s branding in the cloud space—shades of the very successful, if somewhat spurious, “Intel Inside” campaign—did it really matter much whether your x86-architecture PC ran Intel or AMD inside, as long as it delivered the right spec?
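To make the workload-matching idea a little more concrete, here is a toy sketch; the service attributes, workload requirements and numbers are invented for illustration and are not the badge program’s real metadata, nor any actual Intel API:

```python
# Illustrative only: a toy matcher for the 'informed workload placement' idea.
# The fields and figures below are invented for the example.

from dataclasses import dataclass

@dataclass
class CloudService:
    name: str
    availability: float      # measured, e.g. from service-assurance telemetry
    vcpu_benchmark: int      # some published per-vCPU performance score
    price_per_hour: float

@dataclass
class Workload:
    min_availability: float
    min_vcpu_benchmark: int

def candidates(workload: Workload, services: list[CloudService]) -> list[CloudService]:
    """Services whose exposed metrics meet the workload's requirements, cheapest first."""
    fit = [s for s in services
           if s.availability >= workload.min_availability
           and s.vcpu_benchmark >= workload.min_vcpu_benchmark]
    return sorted(fit, key=lambda s: s.price_per_hour)

services = [
    CloudService("provider-a", 0.99995, 120, 0.45),
    CloudService("provider-b", 0.99999, 150, 0.60),
    CloudService("provider-c", 0.99950, 180, 0.30),
]
batch_analytics = Workload(min_availability=0.999, min_vcpu_benchmark=140)
print([s.name for s in candidates(batch_analytics, services)])
# -> ['provider-c', 'provider-b']: both meet the bar; the cheaper one ranks first.
```

The interesting part is not the trivial filter, of course, but where the numbers come from: published, comparable metrics rather than marketing claims.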
And therein (in that echo of “Intel Inside”), I think, lies a problem. What matters, when buying a cloud service, is defining and managing the SLAs involved, in conjunction with other factors (such as the business viability of the cloud services provider) and the cost of the SLA. Get this right, and it can’t matter much what the cloud runs on, I think. Get this wrong, and a cloud running on the best Intel technology—or on a System z mainframe—can be very ‘unreliable’ in practice. I once lost access to my Gmail service because a robot somewhere decided (erroneously, and without an appeal process) that I was abusing my Gmail storage SLA—which, in my book, made Gmail just as unreliable as if it ran on cheap servers that caught fire periodically and lost my email in the process (that was long ago, and Google doesn’t seem to behave like that anymore). Any form of “Intel technology Inside” doesn’t make much sense in a cloud culture.
What could make sense—and even build Intel’s brand—is an open-standards-based, cross-platform “assured by Intel” badge. If a body like the IIC can get industry-wide acceptance of a process around defining workloads, defining service levels, and matching one with the other in the context of standardised SLA definitions, then this would be very encouraging to businesses venturing into cloud. And if Intel supplied technology to measure and assure adherence to all this, using big data originating in the hardware, well, it might face competition, but I have every confidence in Intel’s ability to build the necessary tools well. And, if Intel chips were instrumented and designed in such a way as to make determining assurance particularly cheap and easy, it might well sell lots of chips off the back of this too.
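Purely as a sketch of what that assurance step might look like, assuming a standardised SLA definition and a year of measured outage records (neither the field names nor the figures come from any IIC or Intel specification), one could imagine something along these lines:

```python
# Sketch of the 'assurance' step: check measured outage records against a
# standardised SLA definition. The record format and SLADefinition fields
# are assumptions for illustration only.

from dataclasses import dataclass

MINUTES_PER_YEAR = 365 * 24 * 60

@dataclass
class SLADefinition:
    target_availability: float   # e.g. 0.99999 for 'five nines'
    planned_counts: bool         # a standardised definition would pin this down

@dataclass
class Outage:
    minutes: float
    planned: bool

def complies(sla: SLADefinition, outages: list[Outage]) -> bool:
    """True if a measured year of outages stays within the SLA's downtime budget."""
    counted = sum(o.minutes for o in outages
                  if sla.planned_counts or not o.planned)
    measured_availability = 1 - counted / MINUTES_PER_YEAR
    return measured_availability >= sla.target_availability

year_of_outages = [Outage(3.0, planned=False), Outage(240.0, planned=True)]
strict = SLADefinition(0.99999, planned_counts=True)
lenient = SLADefinition(0.99999, planned_counts=False)
print(complies(strict, year_of_outages))   # False: 243 minutes blows the ~5.3 minute budget
print(complies(lenient, year_of_outages))  # True: only the 3 unplanned minutes count
```

The value of the badge would lie in everyone agreeing on the definition being checked, and in the measurements coming cheaply and trustworthily out of the underlying platform.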
What doesn’t make sense, of course, is defining standards too early in the emergence (or re-emergence—some form of cloud was around in the mainframe world 40 years ago) of a disruptive innovation like cloud. However, I think this is a bigger issue at the lower, technology, level than it is at the higher, business, level—a range of rapidly evolving technologies could easily support a common definition of ‘five nines availability’ in an SLA—and that definition wouldn’t change much even if cloud technology did. What also doesn’t make sense is a proliferation of standards—there seems to be a common delusion that “standards are good, so 10 standards for something are 10 times better than one standard”. If I were involved in the IIC, within the OMG, I’d be taking a look at Cloud Commons, and its Service Measurement Index in particular, and wondering whether they overlap and whether they could be consolidated.
So, this is a rather longer blog about the Intel Analyst Summit than I intended; and I didn’t get to cover all the interesting Intel technologies we heard about. Trust me, Intel does know about designing chips, implementing high-performance computing and providing compilers, associated tools and (most important) thread visualisation tools that will help your programmers take advantage of today’s highly-parallel computer architectures. It’s just that Intel’s reinvention of itself, and its inputs to evolving cloud standards, might be just as important to the general reader.