SYS-CON.tv is a unique multimedia resource - enabled by Flash video - that brings you timely interviews, news, expert panels, and features on all that's new and all that's best among i-Technology products and services.
Generally speaking, there are two kinds of companies in the world: data rich and data poor. The richest of the data rich are easy to name: Google, Facebook, Amazon, Apple. But you don’t need to be at the top of this list to use data to create value. You need to have the tools in place to turn information (data) into action. That’s what the data rich do that the data poor and the data middle class do not.
This is part one of a two-part series of posts about using some common server storage I/O benchmark tools and workload scripts. View part II here which includes the workload scripts and where to view sample results.
There are various tools and workloads for server I/O benchmark testing, validation and exercising different storage devices (or systems and appliances) such as Non-Volatile Memory (NVM) flash Solid State Devices (SSDs) or Hard Disk Drives (HDD) among others.
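To make the idea concrete, here is a minimal sketch of the kind of sequential-write timing that these benchmark tools perform, written in plain Python. This is an illustration only: real tools such as fio, diskspd, or vdbench control caching, queue depth, block alignment, and access patterns far more rigorously, and the file size and block size below are arbitrary choices.

```python
import os
import tempfile
import time

def sequential_write_mbps(total_mb=64, block_kb=128):
    """Time a sequential write to a temp file and return throughput in MB/s.

    A toy illustration of what dedicated benchmarks do with far more
    rigor; it does not control for filesystem caching beyond one fsync,
    nor does it test random I/O, reads, or queue depth.
    """
    block = b"\0" * (block_kb * 1024)
    blocks = (total_mb * 1024) // block_kb
    fd, path = tempfile.mkstemp()
    try:
        start = time.perf_counter()
        with os.fdopen(fd, "wb") as f:
            for _ in range(blocks):
                f.write(block)
            f.flush()
            os.fsync(f.fileno())  # push data toward the device, not just the page cache
        elapsed = time.perf_counter() - start
        return total_mb / elapsed
    finally:
        os.remove(path)

if __name__ == "__main__":
    print(f"sequential write: {sequential_write_mbps():.1f} MB/s")
```

Even a sketch like this shows why workload parameters matter: changing the block size or adding an fsync per block can change the reported number dramatically, which is exactly why the scripts in part II pin those parameters down.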
Microsoft took another step toward being king of the cloud hill when it announced in January that it was releasing its Azure stack to the public. There are many technical reasons why this is cool, but more importantly, it's the psychological advantage this gives Microsoft.
Google has always allowed developers on its stack to develop locally with the same tools that run in Google App Engine. It recently forked its environments, so now the local and cloud environments are slightly different for some of the configurations -- I can't tell you how many nights I have lost sleep becaus...
Tangerine Bank in Toronto has improved end-user experiences for online banking and made data more secure across its lifecycle, speeding the delivery of a new credit card offering.
We'll now learn how improving end-user experiences for online banking and making data more secure across its lifecycle has helped speed the delivery of a new credit card offering.
IT planning is an imprecise science: IT experts are asked to increase the flexibility and agility of IT environments while holding down costs. In an ideal world, where time and budget are not limiting factors, an organization's infrastructure is upgraded on an ongoing, as-needed basis.
In the real world, IT administrators have to make decisions about the hardware they put in place and how to maintain acceptable service levels over the course of the equipment's expected life. Most businesses do not have the luxury of replacing their current storage systems when IT demands outpace ...
Opining about the future of AI at the recent Brilliant Minds event at Symposium Stockholm, Google Executive Chairman Eric Schmidt rejected warnings from Elon Musk and Stephen Hawking about the dangers of AI, saying, “In the case of Stephen Hawking, although a brilliant man, he’s not a computer scientist. Elon is also a brilliant man, though he too is a physicist, not a computer scientist.”
This absurd dismissal of Musk and Hawking was in response to an absurd question about “the possibility of an artificial superintelligence trying to destroy mankind in the near future.” Schmidt went on to sa...
Cloud computing has taken over the business world! With almost maniacal focus, everyone from sole proprietors to the boards of the world's largest conglomerates sees this new model as a "must do". This rapid shift is, in fact, accelerating. As Jeff Bertolucci observes in "The Shift to Cloud Services Is Happening Faster Than Expected":
"According to the sixth annual Uptime Institute Data Center Industry Survey, which examines the big-picture trends shaping IT infrastructure delivery and strategy, the move to cloud services is accelerating. The Uptime Institute's February 2016 poll of more than 1,00...
The Dean of the University of San Francisco School of Management, Elizabeth Davis, recently asked me to sit on a Big Data panel at the Direct Sales Association conference. I was given a 5-minute slot to “demystify” Big Data to a non-technical group of about 1,000 people; to help them understand where and how this thing called “Big Data” could help them.
Well if you know me, I can barely introduce myself in 5 minutes. But this was particularly challenging for me, as I’m used to talking about Big Data with organizations with at least some level of Big Data experience or understanding (maybe the...
Right off the bat, Newman advises that we should "think of microservices as a specific approach for SOA in the same way that XP or Scrum are specific approaches for Agile Software development". This analogy is interesting because my expectation was that microservices is a pattern. From it I might infer that microservices is a set of process techniques rather than an architectural approach. Yet in the book, Newman clearly includes elements of concept model and architecture as well as process and organization.
Data is an unusual currency. Most currencies exhibit a one-to-one transactional relationship. For example, the quantifiable value of a dollar is considered to be finite – it can only be used to buy one item or service at a time, or a person can only do one paid job at a time. But measuring the value of data is not constrained by those transactional limitations. In fact, data currency exhibits a network effect, where data can be used at the same time across multiple use cases thereby increasing its value to the organization. This makes data a powerful currency in which to invest.
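The network effect described above can be made concrete with a little arithmetic: unlike a dollar, a dataset is not consumed by use, so its value is the sum across use cases while its acquisition cost is paid once. The per-use-case figures below are entirely hypothetical, chosen only to illustrate the shape of the calculation.

```python
def dataset_net_value(use_case_values, acquisition_cost):
    """Net value of one dataset reused across many use cases.

    Because the data is not consumed by use, each additional use case
    adds its full value while the acquisition cost is paid only once.
    """
    return sum(use_case_values) - acquisition_cost

# Hypothetical example: one customer dataset, three concurrent use cases.
uses = {"churn model": 120_000, "cross-sell targeting": 80_000, "fraud flags": 50_000}
net = dataset_net_value(uses.values(), acquisition_cost=60_000)
print(f"net value across {len(uses)} use cases: {net}")
```

The point of the sketch is the asymmetry: adding a fourth use case raises the sum but not the cost, which is the transactional limitation ordinary currencies cannot escape.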
It seems like every day we see reminders of the importance of thorough security testing from all areas of the software world. Security has become an especially critical consideration for APIs in recent years. Organizations rely on APIs to share and receive information - either from a third party or between internal applications - so the level of security between these applications is critical for anyone who uses them.
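One common building block of security between applications sharing an API is request signing: the sender signs each request body with a shared secret so the receiver can verify both origin and integrity. Below is a minimal sketch using Python's standard-library `hmac` module; the secret and payload are made up for illustration, and real keys would come from a secret store, not source code.

```python
import hashlib
import hmac

def sign_request(secret: bytes, body: bytes) -> str:
    """Compute an HMAC-SHA256 signature over a request body."""
    return hmac.new(secret, body, hashlib.sha256).hexdigest()

def verify_request(secret: bytes, body: bytes, signature: str) -> bool:
    """Verify a signature using a constant-time comparison,
    which guards against timing attacks on the check itself."""
    expected = sign_request(secret, body)
    return hmac.compare_digest(expected, signature)

secret = b"demo-shared-secret"  # hypothetical; never hard-code real secrets
body = b'{"account": 42, "action": "credit"}'
sig = sign_request(secret, body)
assert verify_request(secret, body, sig)
# Any tampering with the body invalidates the signature:
assert not verify_request(secret, b'{"account": 42, "action": "debit"}', sig)
```

Security testing for an API then includes checks like the second assertion: deliberately tampered requests must be rejected, not just well-formed ones accepted.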
More than 95 percent of the world’s enterprises rely on SSH user keys to provide administrators and developers an effective means of gaining encrypted access to critical infrastructure: operating systems, applications, payment processing systems, databases, human resource and financial systems, routers, switches, firewalls and other network devices. It is a lifeline of traffic flow within our data centers, our cloud environments and how our third-party vendors and supply chain access our environments. It has done its job quietly and efficiently over the last two decades. Unfortunately, the acc...
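A first step toward getting SSH key access under control is simply knowing what keys exist. Below is a minimal sketch that inventories keys by type from text in `authorized_keys` format; the sample entries are fabricated, and real key governance (ownership, rotation, expiry) requires far more than a count.

```python
KNOWN_KEY_TYPES = ("ssh-ed25519", "ssh-rsa", "ecdsa-sha2-nistp256", "ssh-dss")

def inventory_authorized_keys(text):
    """Count keys by type in authorized_keys-format text.

    Even a simple inventory like this often reveals forgotten keys
    left behind by departed staff or retired CI jobs.
    """
    counts = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blanks and comments
        # Options may precede the key type, so scan for the type field.
        for field in line.split():
            if field in KNOWN_KEY_TYPES:
                counts[field] = counts.get(field, 0) + 1
                break
    return counts

sample = """# deploy host
ssh-ed25519 AAAAC3Nza... alice@laptop
ssh-rsa AAAAB3Nza... old-ci-job
ssh-rsa AAAAB3Nza... bob@desktop
"""
print(inventory_authorized_keys(sample))
```

In practice such a script would be run across every host's `authorized_keys` files and cross-referenced against a list of current staff and services, which is where the quiet two-decade accumulation the article describes tends to surface.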
Having automated workflows sounds like a sexy thing. And it is. It feels pretty awesome to be able to initiate a process with a button and forget about it while the work gets done.
But that doesn’t mean you have to automate every process in your office. I mean, I think people can take care of replacing the copy paper on their own, right? Right? Come on, people.
Because automation shouldn’t be blatantly applied to every corner of your business, it’s good to start with a very basic question before you begin.
There are a lot of ways to improve your business – why are you looking to autom...
I’ve heard several clients complain about the curse of “orphaned analytics”; which are one-off analytics developed to address a specific business need but never “operationalized” or packaged for re-use across the organization. Unfortunately, many analytic organizations lack a framework for ensuring that the analytics are not being developed in a void. Organizations lack an overarching model to ensure that the resulting analytics and associated organizational intellectual capital can be captured and re-used across multiple use cases.
Join Us at the Santa Clara Convention Center in Santa Clara, CA, November 1-3
Cloud computing software is eating the world, and each day is bringing new developments in this world.
Yesterday's debate about public vs. private has transformed into the reality of hybrid cloud: a recent survey shows that 74% of enterprises have a hybrid cloud strategy.
Meanwhile, 94% of enterprises are using some form of XaaS – Software, Platform, and Infrastructure as a Service.
Big Data | Analytics and the emerging Internet of Things (IoT) are driving exponentially increased demands on datacenters and developers alike, as we cross the zettabyte horizon this year.
Containers and microservices are now part of every PaaS conversation, and IaaS providers are increasingly competing for platform customers.
WebRTC continues to reform web communications, and DevOps is pushing its way into an enterprise IT world that is increasingly agile, lean, and continuous.
Through all this, Cloud Expo remains the single independent event where delegates and technology vendors can meet to experience and discuss the entire world of the cloud.
Only Cloud Expo brings together all this in a single location:
• Cloud Computing
• Big Data | Analytics
• Internet of Things
• Containers | Microservices
Cloud computing budgets worldwide are reaching into the hundreds of billions of dollars, and no organization can survive long without some sort of cloud migration strategy. Each month brings new announcements, use cases, and success stories.
Cloud Expo offers the world's most comprehensive selection of technical and strategic Industry Keynotes, General Sessions, Breakout Sessions, and signature Power Panels. The exhibition floor features 100+ exhibitors offering specific solutions and comprehensive strategies.
The floor also features a Demo Theater that gives delegates the opportunity to get even closer to the technology they want to see and the people who offer it.
Attend Cloud Expo. Create your own custom experience. Learn the latest from the world's best technologists. Talk to the vendors you are considering, and put them to the test.