If a Picture’s Worth a Thousand Words, How About 30 Million Pictures?

Audie Hittle

Federal CTO, Emerging Technologies Division, EMC Corporation

Picture this – there is one surveillance camera for every 10 people in the U.S. – that’s close to 30 million cameras, and growing. As Feds experience a swelling flood of video data, the question looms: How can they manage, analyze, and secure it – unlocking insights to protect our nation?

To help out, we teamed up with MeriTalk’s Bill Glanz to explore the impact of the video data deluge in EMC’s newest video blog, “Focal Point: Video Data Explosion.”

What better way to explore the video surveillance landscape than through a video blog?

The Big Picture

The video surveillance market is expected to grow to $26B in 2018, according to market research by IHS Technologies, and who watches more TV than Uncle Sam? One thing is increasingly clear – video data is invaluable to agencies. According to MeriTalk’s new study, “The Video Vortex,” a resounding 99% of Feds say video is key to preventing crime, theft, and terrorism over the next five years – which shows just how crucial it is that Feds glean as much insight as possible from their video data.

Now here’s the scary part – despite its importance to national security, 54% of Federal video surveillance data currently goes unanalyzed. Imagine the possibilities if Feds analyzed it all – facial recognition, anomaly detection, traffic monitoring, and the list goes on…

What’s Blocking Their View?

MeriTalk asked the Feds, and they point to IT infrastructure as a key barrier to full data analysis. Right now, 91% say they need increased storage, 89% say increased computing power, and 84% say increased personnel – all important items on the Federal shopping list.

Another key component to the video data solution? Collaboration. 79% of Feds believe increased collaboration between physical security and IT will lead to greater video analysis – meaning more data insights. But, there’s another barrier. 76% of physical security managers currently see video surveillance programs as a collaboration – but few IT managers agree (just 33%). To move forward, everyone needs to be on the same page.

Light at the End of the Tunnel

These statistics provide great insight, but what’s a Fed to do when their agency can’t handle current video volumes, much less the data deluge that’s expected to come down the pike?

Don’t let your data headache turn into a migraine. Join us on June 11 at 1:30 p.m. ET for a complimentary webinar to take a deeper dive into “The Video Vortex” findings. You’ll get direct insight from Feds and walk away with new ideas on how to approach the video data influx. Then check out EMC’s video surveillance core and edge solutions, including the Isilon Scale-Out Data Lake and the EMC VNX-VSS100, for simple, efficient, and flexible options.

Not Your Father’s EMC

Jeremy Burton

President, Products & Marketing

Looking back at press coverage coming out of EMC World last week, there’s one phrase that pops up time and time again – “Not your father’s EMC.” This clearly struck a chord with customers and employees alike, and a number of you have asked me for more.

First of all, there are many great things about EMC that don’t need to change! There is a great heritage inside the company of doing what it takes to keep customers happy… we have great relationships with many of the world’s largest companies and governments… and we have an expansive best-of-breed technology portfolio.

But the world is changing.

The way products will be built, evaluated, marketed, sold, used, serviced and supported is different in the 3rd platform. These changes force us to reevaluate everything we know about the traditional product lifecycle.

Let’s start with building products. EMC’s traditional products – storage arrays – will be used for many years to come underneath traditional data center applications, Oracle databases, and the like. But for new 3rd platform applications, much of the value within the infrastructure will be delivered entirely through software… running on common off-the-shelf hardware. We believe that much of this software will be created using community-based development – “open source.”

The benefits to the customer are clear – more features, more quickly, without lock-in. “And free?” I hear you say. Not necessarily. I still believe that most customers will want to buy a complete working system (hardware + software + service) and for that they will be happy to pay. I do not believe we are heading back to a world where organizations buy component parts to spend days and weeks doing self-assembly.

With that in mind, last week, we announced the CoprHD open source project, essentially a release of the ViPR Controller source code into the community. I’ve been very clear that this project is merely the first we’ve picked and it is a part of a much more expansive open source effort you’ll see roll out over the next year.

Releasing the intellectual property of one of EMC’s mainstream products into the world of open source is not something we’ve ever done before. It’s a first – this is clearly not your father’s EMC.

Next, evaluating products. I’ve long believed that the people who evaluate and use our products are not the people who buy them. Usually the ‘buying’ is done by corporate procurement. As the infrastructure world moves increasingly toward software it should be much easier for folks who evaluate and use our software to simply download the binaries and get going. They should not have to wait for a license agreement to be in place before doing so.

With that in mind, last week, we announced the free download of ScaleIO – a software-defined block storage offering. It isn’t time-bombed. It isn’t feature-limited. It’s free for non-production use. Our belief is that if the users of our software like it, then they’ll recommend it and their procurement team will buy it for production use. Like open source, you should consider this free download as merely a first step and look forward to a much more expansive set of downloads over the next year.

Releasing unlimited, full-featured commercial software onto EMC.com for free download is not something we’ve ever done before. It’s a first – this is clearly not your father’s EMC.

But we’re not stopping there. We’re moving aggressively towards online/social marketing, quoting and transacting through our web store, publishing our product documentation so it’s searchable by Google, educating through MOOCs and supporting through online discussion forums and communities. The goal? To eliminate as much friction as possible at every stage in the product lifecycle – making it easier for our customers and our partners to interact with EMC.

Three years from now, I expect some parts of EMC will be the same as today, but many parts will look and feel very different from what we’ve always known. We’ll have the same high standard for quality, close relationships with customers, and we will continue to be a trusted place where people send their data for safe keeping, because their data is going to be their business. Those are the traits we want to keep. But the way we develop our products, the way we release our products, the way we sell them and the way we market them will be different. In the coming months and years, we’re going to adopt more digital techniques to give our customers the kind of experience they want. And that experience will not be your father’s EMC.

This post was originally published on our sister blog Reflections, where senior leaders at EMC blog regularly on trends in information technology.

Real-time Actionable Information Sharing – A Mission Critical Capability in the Cyber Battle

Mike Brown

Vice President & General Manager, RSA Global Public Sector

On April 18, 1775, Boston and a soon-to-be fledgling nation faced a certain threat against a massive attack surface originating from thousands of miles away via the sea. At the onset of the American Revolutionary War, Paul Revere collaborated with volunteers at Boston’s Old North Church to hang either one or two lanterns on the church steeple. Their code communicated to other communities and organizations – in real-time – the attack vector British troops were using for their approach. This first example of public-private information sharing is immortalized in the famous line of the Henry Wadsworth Longfellow poem: “One if by land, and two if by sea.”

Then, as now, information sharing is a critical tool in countering the major threats businesses face. The nature of the cyber threat is different than anything else we’ve ever known or have been able to address in our business and legal systems – so, like Paul Revere, organizations have to think about it differently and craft different solutions. The U.S. Congress is meeting this week to consider a legislative framework for approaching cyber threats. Elected representatives are debating the merits and content of legislation concerning, among many things, information sharing and liability relief. Both of those topics are mission critical to countering the threats we face today, and those we will face tomorrow.

Real-time actionable cyber threat information sharing between and among private and public sectors is needed to address diverse technology and business objectives. Through effective open and robust information sharing, organizations have a better success rate against the effects of malicious actors. Working together we maximize the reach of our cyber workforce in defending the public and private sectors from an ever changing threat environment.

We need to accept that current advanced protections don’t work. Furthermore, without evolving the security model, they will continue to fail. We know that point products, signature-based defensive approaches, and even traditional strategies are not enough to address the challenge. To overcome the threat posed by adversaries we need real-time information sharing across the public and private sectors. And of course, this data must be consumed, understood, and acted on by advanced security teams capable of processing it immediately.

Information sharing, and the pending legislation, should allow the effective dissemination of near real-time actionable information, hopefully machine readable, that can assist new efforts to defeat malicious actors. We need this information – threat intelligence – because the old strategies of protecting the perimeters don’t work. We need visibility, access, and agility to see what the malicious actors are doing in our networks. Yes – they are in our networks. We need to prevent them from succeeding in their ultimate objectives. Information sharing will assist our ability to quickly detect and respond to these malicious actors and Congressional action should support those operational principles.

Today, 240 years after Paul Revere’s midnight ride, society may not be recognizable, but the principles that those American Patriots and Sons of Liberty espoused are still visible. The cyber threat highlights one similarity: we, people and organizations around the world, face an existential threat to our way of life that can only be mitigated by a cooperative approach. Private companies and governments alone cannot overcome the myriad threats we face – they don’t have the resources or capabilities. Hopefully, current legislative action will help achieve what is needed to preserve and protect the principles fought for so many years ago.

Prescribing Cloud – Healthcare Goes Hybrid

Dave DeAngelis

General Manager, Global Healthcare Business

It’s good to have a plan.

Healthcare data is growing faster than ever before. At 48 percent each year, it’s one of the fastest growing segments in the Digital Universe. This data is coming from many sources – clinical applications, compliance requirements, population health, and FutureCare-enabling technologies for cloud, Big Data, mobile, and social – just to name a few.

Health IT needs a plan to manage and take advantage of all this information. More than ever before, a hybrid cloud model needs to be part of that plan.

On the Road to Cloud

According to a recent MeriTalk and EMC report, in 2015, 62 percent of health IT leaders are increasing cloud budgets to provide more coordinated, cost-effective care.

Where should they focus their IT budget? A hybrid cloud lets you place the right applications in the right cloud with the right cost and performance. And, it lets you protect and secure protected health information (PHI). The goal – eliminate data silos to gain a 360 degree view of the patient, in real-time, at the point of care.

A hybrid cloud consolidates and eliminates inefficient silos. Healthcare providers can balance clinical and business workloads in an enterprise hybrid cloud that incorporates private, managed private, and trusted public clouds.

As Health IT starts this journey, other organizational objectives can get jump-started. For example, Health IT is then better equipped to deploy a data lake for clinical collaboration and an agile data and analytics platform for storage, management, and analysis, by bringing together data from different sources across multiple protocols.

As a result, you have the opportunity to deploy clinical predictive analytics for managing population health, reducing readmissions, and optimizing patient treatment plans.

And, with just 18 percent of providers running EMR applications partially or fully in a hybrid cloud today, opportunity lies ahead. To take advantage, Health IT organizations can begin with a converged infrastructure, which provides a fast track to enterprise hybrid cloud computing by combining compute, network, and storage resources.

Health IT executives attending HIMSS15 are working on the frontlines to realize the promise of accountable care. We’re excited for the opportunity to come together to share new ideas and lessons learned.

In the end, the ultimate driver in healthcare is outcomes. Hybrid cloud improves IT outcomes by driving down costs. Once more cost is taken out of infrastructure, it can be reinvested in innovation. That, in turn, improves patient care outcomes.

And that sounds like a good plan.

What’s Driving the Data Lake?

Bill Schmarzo

Chief Technology Officer, Enterprise Information Management Group

EMC’s Federation Business Data Lake (FBDL) announcement has been a long time in the making. It’s been a perfect storm of industry trends that enable Big Data and make data lakes a feasible data architecture option. These trends include:

Data Growth – Web applications, social media, mobile apps, sensors, scanners, wearable computing, and the Internet of Things are all generating an avalanche of new, more granular data about customers, channels, products, and operations that can now be captured, integrated, mined, and acted upon.

Cheap Storage – The cost of storage is plummeting, which enables organizations to think differently about data. Leading organizations are transitioning from viewing data as a cost to be minimized to valuing it as an asset to be hoarded. Even if they don’t yet know how they will use that data, they are transitioning to a “data abundance” mentality.

Limitless Computing – The ability to bring to bear an almost limitless amount of computing power to any business problem allows organizations to process, enrich, and analyze this growing wealth of data, uncovering actionable insights about their customers and their business operations.

Real-time Technologies – Low-latency data access and analysis is enabling organizations to identify and monetize “events in the moment” while there is still value in the freshness or recency of the event.

While this list is impressive, it is not complete. There are two other key industry trends that are driving Big Data and the data lake:

Open Source Software is democratizing software tools like Hadoop, R, Shark, YARN, Mahout, and MADlib, by putting these tools within the reach of any organization. Open source software is fueling innovation from startups and Fortune 500 organizations to universities and digital media companies; it is liberating organizations from being held captive by the product development cycles of traditional enterprise software vendors.

Many smart people have been working hard to pull the FBDL together, and I am proud to say that I saw the data lake coming as early as May 2012, when I published my “Understanding the Role of Hadoop In Your BI Environment” blog post. Okay, okay, I originally called it a Hadoop-based “Operational Data Store,” but name aside, I got many of the key benefits right:

Hadoop brings at least two significant advantages to your ETL and data staging processes. The first is the ability to ingest massive amounts of data as-is, meaning you do not need to pre-define the data schema before loading data into Hadoop. This includes not only traditional transactional data (e.g., point-of-sale transactions, call detail records, general ledger transactions, call center transactions), but also unstructured internal data (like consumer comments, doctor’s notes, insurance claims descriptions, and web logs) and external social media data (from sites such as LinkedIn, Pinterest, Facebook, and Twitter). So regardless of the structure of your incoming data, you can rapidly load it all into Hadoop, as-is, where it then becomes available for your downstream ETL, DW, and analytic processes.

The second advantage that Hadoop brings to your BI/DW architecture occurs once the data is in the Hadoop environment. Once it’s in your Hadoop ODS, you can leverage the inherently parallel nature of Hadoop to perform your traditional ETL work of cleansing, normalizing, aligning, and creating aggregates for your EDW at massive scale.
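The schema-on-read pattern described above can be sketched in a few lines. The following is an illustrative Python sketch only, not Hadoop or EMC code; the record shapes and field names are invented for the example. The idea is the same, though: raw records of varying structure land in the lake untouched, and structure is applied only when a downstream job reads them.

```python
import json

# Heterogeneous raw records, ingested as-is with no upfront schema
# (hypothetical shapes: transactional point-of-sale data alongside
# unstructured consumer comments).
raw_records = [
    '{"type": "pos", "store": 12, "amount": 19.99}',
    '{"type": "comment", "text": "Great service!"}',
    '{"type": "pos", "store": 7, "amount": 5.25}',
]

# Ingest step: every record lands in the "lake" unchanged.
lake = [json.loads(line) for line in raw_records]

# Downstream "ETL" step: apply structure only to the records this job
# cares about, e.g. aggregate point-of-sale totals per store.
totals = {}
for rec in lake:
    if rec.get("type") == "pos":
        totals[rec["store"]] = totals.get(rec["store"], 0.0) + rec["amount"]

print(totals)  # prints {12: 19.99, 7: 5.25}
```

In a real Hadoop deployment the "ingest" step is simply copying files into HDFS, and the aggregation would run as a parallel job across the cluster rather than a single loop.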

And finally, Data Science – the most exciting industry trend for me. Analytic tools combined with the volume, variety, and velocity of data are converging with training and education, business-centric methodologies, and innovative thinking to enable organizations to “weave data hay into business gold” – uncovering customer, product, and operational insights from data lakes that can be used to optimize key business processes and uncover new monetization opportunities.

What Does the Future Hold?

EMC’s Federation Business Data Lake takes a big step in the maturation of data lakes by leveraging Big Data industry trends to create a living “interconnected tissue” entity. The features outlined in the FBDL will fuel the business transformational processes that we are already seeing underway at many clients. But there still is a long way to go as tools, training, and methodologies continue to evolve, helping organizations think differently about the role of data and analytics to power their value creation processes.
