Performance is the elusive butterfly of API development. Everybody is intrigued with its beauty, yet few know how to capture it.
In the old days, many shops ensured a performant API by writing some code and then tossing it over the wall to QA for load testing. Later, some integration testing took place. As long as the API worked and met some marginal performance benchmarks, things were good.
This worked well when a public, HTTP-based API consumed by a wide variety of distributed devices was more the exception than the rule. However, today APIs are a big deal a...
Microsoft pulls a fast one! Good showing, Microsoft.
Project Natick is Microsoft's R&D feasibility project to explore manufacturing and operating an underwater datacenter.
Hey, you don't have to go looking for cooling water if you can just take the salt out.
How is latency improved? Drop the datacenter in the nearest ocean or lake.
Energy efficiency is a no-brainer considering the environment the datacenter is in.
Bring in 3D manufacturing and you can have a datacenter built and deployed in no time at all, with no need for expensive land acquisition, licenses, certificates, etc.
I sat down with Michael Rösch, COO of POOL4TOOL, to chat about cloud computing. With a lot of buzz about the impact of the cloud on business, it was a chance to get a perspective, as well as a few hints and tips, from someone who has been at the coalface of procurement cloud services for the past 15 years. Michael has been at POOL4TOOL since 2000, becoming COO in 2012, and has worked on projects with German giants like Behr, Hansgrohe, Heidelberger Printing Presses, Carl Zeiss and ThyssenKrupp Presta in that time.
In the middle of World War II, very basic and primitive computers were designed to improve accuracy for naval gunfire. The first computers ran complex mathematical applications to calculate trajectories for gunfire from large battleships. These computers were huge, built on vacuum-tube technology. You could literally walk into one. (And needed to, when a tube went bad and had to be replaced.)
My colleague, Jennelle Crothers, and I got a chance to chat about the usability and ease of use in this episode of Windows 10 series in TechNet Radio. She’s great and we had much fun doing it. Jennelle did a quick demo on how to make Windows 10 even easier to use.
Enterprises across all industries are in the midst of difficult, bet-the-company changes we call digital transformation. And yet, there remains broad confusion about the scope of the word "digital."
While the rise of mobile technologies is the impetus for many such transformations, digital goes well beyond the choice of user device. In fact, digital transformation reflects the fact that customer preferences and behavior are driving enterprise technology decisions – and such technology decisions impact the entire organization, from innovative, customer-focused efforts all the way to the traditi...
The arrival of the Microsoft Azure Stack Technical Preview marks a turning point in the cloud-computing market and forms a leading indicator of how dramatically Microsoft has changed in the past two years.
The cloud turning point comes because the path to hybrid-cloud capabilities and benefits has a powerful new usher, one with the enterprise, developer, and service-provider presence, the R&D budget, and the competitive imperative to succeed in a market that is still underserved and nebulous.
Over the past year, Microsoft has been introducing Azure Resource Manager (ARM) as the preferred way to provision and manage resources in its Azure cloud. ARM is the successor to the original Service Management model, also known as "Classic." While Classic will continue to be supported for the foreseeable future, ARM is the preferred deployment model, and all new Azure features are being released on ARM only. Here is an overview of ARM and the new features it provides.
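A core feature of the ARM model is declarative, template-based deployment. As a minimal sketch of what such a template looks like, here is one that provisions a single storage account; the parameter name and the specific `apiVersion` are illustrative, not prescriptive:

```json
{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "storageAccountName": { "type": "string" }
  },
  "resources": [
    {
      "type": "Microsoft.Storage/storageAccounts",
      "name": "[parameters('storageAccountName')]",
      "apiVersion": "2015-06-15",
      "location": "[resourceGroup().location]",
      "properties": { "accountType": "Standard_LRS" }
    }
  ]
}
```

Because the template describes the desired end state rather than a sequence of API calls, ARM can deploy all the resources in a template as a single unit into a resource group, a capability the Classic model lacked.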
The Federal Government’s “Cloud First” policy mandates that agencies take full advantage of cloud computing benefits to maximize capacity utilization, improve IT flexibility and responsiveness, and minimize cost. The Federal Risk and Authorization Management Program (FedRAMP) is a mandatory government-wide program that provides a standardized approach to security assessment, authorization, and continuous monitoring for cloud products and services. Advantages for business include being able to market to many federal agencies after a single FedRAMP review following the government’s “approve once...
Once again, the boardroom is in a bitter battle over what edict its members will now levy on their hapless IT organization. On one hand, hybrid cloud is all the rage. Adopting this option promises all the cost savings of public cloud with the security and comfort of private cloud. This environment would not only check the box for meeting the cloud computing mandate, but also position the organization as innovative and industry-leading. Why wouldn’t a forward-leaning management team go all in with cloud?
With the proliferation of both SQL and NoSQL databases, organizations can now target specific fit-for-purpose database tools for their different application needs regarding scalability, ease of use, ACID support, etc. Platform as a Service offerings make this even easier now, enabling developers to roll out their own database infrastructure in minutes with minimal management overhead. However, this same amount of flexibility also comes with the challenges of picking the right tool, on the right provider and with the proper expectations.
In his session at 18th Cloud Expo, Warner Chaves, Princ...
Digital transformation has increased the speed at which organizations must adapt. As they do so, it’s more important than ever to be able to choose solutions that will give them a comprehensive, real-time view of the network. Several factors contribute to this new priority:
Root causes and threats must be quickly identified, so network and security ops personnel must have the ability to view and share real-time data from multiple network environments.
The battle over bimodal IT is heating up. Now that there’s a reasonably broad consensus that Gartner’s advice about bimodal IT is deeply flawed – consensus everywhere except perhaps at Gartner – various ideas are springing up to fill the void.
The bimodal problem, of course, is well understood. ‘Traditional’ or ‘slow’ IT uses hidebound, laborious processes that would only get in the way of ‘fast’ or ‘agile’ digital efforts. The result: incoherent IT strategies and shadow IT struggles that lead to dispersed, redundant, and risky technology choices across the organization.
The battle, however,...
The Net Neutrality fight has been all over the news this year, with the latest installment coming from T-Mobile. Private and public companies alike are tuned in to this continuing saga to see how the eventual outcome will affect business and ultimately their online lives. But whether the FCC's recent ruling on Net Neutrality stands the test of time or not, there is a new technology that will provide customers fast, reliable Internet service at much lower cost than private business lines.
The (re?)emergence of Microservices was especially prominent in this week's news. What are they good for? Do they make sense for your application? Should you take the plunge? And what do Microservices mean for your DevOps and Continuous Delivery efforts?
Continue reading for more on Microservices, containers, DevOps culture, and more top news from the past week. As always, stay tuned to all the news coming from @ElectricCloud on DevOps and Continuous Delivery throughout the week, and retweet/favorite to get your favorite pieces featured in our weekly recap.
Join Us at the Javits Convention Center in New York, New York, June 7-9
Cloud computing software is eating the world, and each day is bringing new developments in this world.
Yesterday's debate about public vs. private has transformed into the reality of hybrid cloud: a recent survey shows that 74% of enterprises have a hybrid cloud strategy.
Meanwhile, 94% of enterprises are using some form of XaaS – software, platform, and Infrastructure as a Service.
Big Data | Analytics and the emerging Internet of Things (IoT) are driving exponentially increased demands on datacenters and developers alike, as we cross the zettabyte horizon this year.
Containers and microservices are now part of every PaaS conversation, and IaaS providers are increasingly competing for platform customers.
WebRTC continues to reform web communications, and DevOps is pushing its way into an enterprise IT world that is increasingly agile, lean, and continuous.
Through all this, Cloud Expo remains the single independent event where delegates and technology vendors can meet to experience and discuss the entire world of the cloud.
Only Cloud Expo brings together all this in a single location:
• Cloud Computing
• Big Data | Analytics
• Internet of Things
• Containers | Microservices
Cloud computing budgets worldwide are reaching into the hundreds of billions of dollars, and no organization can survive long without some sort of cloud migration strategy. Each month brings new announcements, use cases, and success stories.
Cloud Expo offers the world's most comprehensive selection of technical and strategic Industry Keynotes, General Sessions, Breakout Sessions, and signature Power Panels. The exhibition floor features 100+ exhibitors offering specific solutions and comprehensive strategies.
The floor also features a Demo Theater that gives delegates the opportunity to get even closer to the technology they want to see and the people who offer it.
Attend Cloud Expo. Create your own custom experience. Learn the latest from the world's best technologists. Talk to the vendors you are considering, and put them to the test.