Prescribing Cloud – Healthcare Goes Hybrid

Dave DeAngelis

General Manager, Global Healthcare Business

It’s good to have a plan.

Healthcare data is growing faster than ever before. At 48 percent each year, it’s one of the fastest-growing segments in the Digital Universe. This data comes from many sources – clinical applications, compliance requirements, population health, and FutureCare-enabling technologies for cloud, Big Data, mobile, and social – just to name a few.

Health IT needs a plan to manage and take advantage of all this information. More than ever before, a hybrid cloud model needs to be part of that plan.

On the Road to Cloud

According to a recent MeriTalk and EMC report, 62 percent of health IT leaders are increasing their cloud budgets in 2015 to provide more coordinated, cost-effective care.

Where should they focus their IT budget? A hybrid cloud lets you place the right applications in the right cloud with the right cost and performance, and it lets you secure protected health information (PHI). The goal – eliminate data silos to gain a 360-degree view of the patient, in real time, at the point of care.

A hybrid cloud consolidates infrastructure and eliminates inefficient silos. Healthcare providers can balance clinical and business workloads in an enterprise hybrid cloud that incorporates private, managed private, and trusted public clouds.

As Health IT starts this journey, other organizational objectives can get jump-started. For example, Health IT is better equipped to deploy a data lake for clinical collaboration and an agile data and analytics platform for storage, management, and analysis by bringing together data from different sources across multiple protocols.

As a result, you have the opportunity to deploy clinical predictive analytics for managing population health, reducing readmissions, and optimizing patient treatment plans.

And, with just 18 percent of providers running EMR applications partially or fully in a hybrid cloud today, opportunity lies ahead. To take advantage, Health IT organizations can begin with a converged infrastructure, which provides a fast track to an enterprise hybrid cloud by combining compute, network, and storage resources.

Health IT executives attending HIMSS15 are working on the frontlines to realize the promise of accountable care. We’re excited for the opportunity to come together to share new ideas and lessons learned.

In the end, the ultimate driver in healthcare is outcomes. Hybrid cloud improves IT outcomes by driving down costs. Once more cost is taken out of infrastructure, it can be reinvested in innovation. That, in turn, improves patient care outcomes.

And that sounds like a good plan.

This post was originally published on our sister blog Reflections, where senior leaders at EMC blog regularly on trends in information technology.

What’s Driving the Data Lake?

Bill Schmarzo

Chief Technology Officer, Enterprise Information Management Group

EMC’s Federation Business Data Lake (FBDL) announcement has been a long time in the making. It’s been a perfect storm of industry trends that enable Big Data and make data lakes a feasible data architecture option. These trends include:

Data Growth – Web applications, social media, mobile apps, sensors, scanners, wearable computing, and the Internet of Things are all generating an avalanche of new, more granular data about customers, channels, products, and operations that can now be captured, integrated, mined, and acted upon.

Cheap Storage – The cost of storage is plummeting, which enables organizations to think differently about data. Leading organizations are transitioning from viewing data as a cost to be minimized to valuing it as an asset to be hoarded. Even if they don’t yet know how they will use that data, they are transitioning to a “data abundance” mentality.

Limitless Computing – The ability to bring to bear an almost limitless amount of computing power to any business problem allows organizations to process, enrich, and analyze this growing wealth of data, uncovering actionable insights about their customers and their business operations.

Real-time Technologies – Low-latency data access and analysis is enabling organizations to identify and monetize “events in the moment” while there is still value in the freshness or recency of the event.

While this list is impressive, it is not complete. There are two other key industry trends that are driving Big Data and the data lake:

Open Source Software is democratizing tools like Hadoop, R, Shark, YARN, Mahout, and MADlib by putting them within the reach of any organization. Open source software is fueling innovation everywhere from startups and Fortune 500 organizations to universities and digital media companies; it is liberating organizations from being held captive by the product development cycles of traditional enterprise software vendors.

Many smart people have been working hard to pull the FBDL together, and I am proud to say that I saw the data lake coming as early as May 2012, when I published my “Understanding the Role of Hadoop In Your BI Environment” blog post. Okay, okay, I originally called it a Hadoop-based “Operational Data Store,” but even though I missed on the name, I got many of the key benefits right:

Hadoop brings at least two significant advantages to your ETL and data staging processes. The first is the ability to ingest massive amounts of data as-is, meaning you do not need to pre-define the data schema before loading data into Hadoop. This includes not only traditional transactional data (e.g., point-of-sale transactions, call detail records, general ledger transactions, call center transactions), but also unstructured internal data (such as consumer comments, doctor’s notes, insurance claims descriptions, and web logs) and external social media data (from sites such as LinkedIn, Pinterest, Facebook, and Twitter). So regardless of the structure of your incoming data, you can rapidly load it all into Hadoop, as-is, where it becomes available for your downstream ETL, DW, and analytic processes.
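To make the schema-on-read point concrete, here is a minimal sketch – assuming a PySpark environment, with hypothetical lake paths and column names that are not part of any EMC product – in which raw files are loaded exactly as they arrive and structure is applied only when the data is queried:

```python
# Minimal schema-on-read sketch (hypothetical paths and columns).
# Raw files land in the lake as-is; structure is applied only at read time.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("raw-ingest").getOrCreate()

# Structured transactions are loaded without a pre-defined warehouse schema...
pos_raw = spark.read.option("header", "true").csv("/datalake/raw/pos_transactions/")

# ...and so are semi-structured clinical notes and unstructured web logs.
notes_raw = spark.read.json("/datalake/raw/doctor_notes/")
weblogs_raw = spark.read.text("/datalake/raw/weblogs/")

# A schema is interpreted at query time, downstream of ingestion.
pos_raw.createOrReplaceTempView("pos_raw")
sales_by_store = spark.sql(
    "SELECT store_id, SUM(CAST(amount AS DOUBLE)) AS total_sales "
    "FROM pos_raw GROUP BY store_id"
)
sales_by_store.show()
```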

The second advantage Hadoop brings to your BI/DW architecture comes once the data is in the environment. With the data in your Hadoop ODS, you can leverage the inherently parallel nature of Hadoop to perform your traditional ETL work – cleansing, normalizing, aligning, and creating aggregates for your EDW – at massive scale.
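As an illustration of that parallel ETL work – again a hedged PySpark sketch with assumed staging paths and column names, not a prescribed implementation – the following pass cleanses and normalizes staged point-of-sale records and builds a daily aggregate destined for the EDW, with Spark distributing each step across the cluster:

```python
# Hypothetical ETL pass over data staged in the Hadoop ODS:
# cleanse, normalize, and aggregate for the EDW, executed in parallel.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("ods-etl").getOrCreate()

staged = spark.read.parquet("/datalake/ods/pos_transactions/")  # assumed staging path

cleansed = (
    staged
    .dropDuplicates(["transaction_id"])                   # remove duplicate records
    .filter(F.col("amount").isNotNull())                  # drop incomplete rows
    .withColumn("store_id", F.upper(F.trim("store_id")))  # normalize the join key
)

# Daily aggregate destined for the enterprise data warehouse.
daily_sales = (
    cleansed
    .groupBy("store_id", F.to_date("transaction_ts").alias("sale_date"))
    .agg(F.sum("amount").alias("total_sales"), F.count("*").alias("txn_count"))
)

daily_sales.write.mode("overwrite").parquet("/datalake/curated/daily_sales/")
```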

And finally, Data Science, which is the most exciting industry trend for me. Analytic tools combined with the volume, variety and velocity of data are converging with training and education, business-centric methodologies, and innovative thinking to enable organizations to “weave data hay into business gold” by uncovering customer, product and operational insights from data lakes that can be used to optimize key business processes and uncover new monetization opportunities.

What Does the Future Hold?

EMC’s Federation Business Data Lake takes a big step in the maturation of data lakes by leveraging Big Data industry trends to create a living, interconnected entity. The features outlined in the FBDL will fuel the business transformation processes that we are already seeing underway at many clients. But there is still a long way to go as tools, training, and methodologies continue to evolve, helping organizations think differently about the role of data and analytics in powering their value creation processes.

This post was originally published on our sister blog Reflections, where senior leaders at EMC blog regularly on trends in information technology.

Goulden Outlines EMC II Strategy

David Goulden

Chief Executive Officer, EMC Information Infrastructure at EMC Corporation

David Goulden, CEO of EMC Information Infrastructure, outlined EMC’s strategy in storage, flash, data protection, converged infrastructure, enterprise hybrid cloud, and security analytics at a forum held in New York on March 10, 2015. Watch an excerpt from the presentation.

This is the second post in a four-part series from EMC Federation leaders.

This post was originally published on our sister blog Reflections, where senior leaders at EMC blog regularly on trends in information technology.

Safety First

Karen DelPrete

Director, Federal Marketing

You probably wouldn’t leave your house without locking the doors. Or ride a motorcycle without a helmet. And you most certainly wouldn’t leave your agency’s data unprotected. What do these scenarios have in common? They are all protective measures people take every day to prevent disastrous situations.

More specifically, protective measures can help Feds avoid a data disaster resulting from downtime or a data loss event.

Organizations are built on data, yet protective measures are frequently overlooked or overtaken by other priorities. It may be time to view data protection in a new light – according to the 2014 Data Protection Global Index, 64 percent of organizations experienced at least one instance of unplanned downtime or data loss in the past year. Public sector organizations are hit the hardest of all. The study found that the public sector experiences more downtime than any other industry – 37 hours versus 25 – and the longest recovery time – 12.43 hours versus 7.95. These numbers send a clear message – it’s time for data protection to rise to the top of organizations’ IT priority lists and become part of their overall cyber security measures.

Cyber attacks might be inevitable, but with the right solutions in place, agencies can avoid disaster. Agencies must be resilient and utilize innovative solutions that enable automation and orchestration. Automated environments not only reduce the time and cost invested in protective measures, they also reduce errors such as the unintentional release of sensitive data.

So, what does a resilient organization look like? It stays ahead of the game by implementing advanced security, continuous availability, and integrated backup and recovery. To get started on the journey to optimal data protection, agencies should consider the following steps: create a resiliency plan, update existing architecture, embrace hybrid automation and orchestration, modernize service level agreements, map to governance and compliance requirements, and implement data protection-as-a-service. For a deep dive into these steps and guidance on how to secure your agency on the road to mission resiliency, check out our eBook on the topic.

Keeping data safe isn’t impossible. Agencies just need to make a plan – today.

Learn more about the importance of data protection through our eBook, and a recent article on the subject.

FedRAMP Gives VMware Technologies a Thumbs Up

Karen DelPrete

Director, Federal Marketing

In case you missed it, the Federal Risk and Authorization Management Program (FedRAMP) has approved VMware’s vCloud Government Service provided by Carpathia, enabling civilian and defense agencies to leverage vSphere technology with the Infrastructure-as-a-Service hybrid cloud.

This is a significant development in helping agencies reduce costs, expand IT resources and services, and improve service levels for users. FedRAMP ensures secure cloud computing for the Federal government, and the announcement means that VMware has achieved the Provisional Authority to Operate (ATO).

By using the vCloud service, organizations can move workloads seamlessly between their internal resources and the cloud, augment existing IT infrastructure capacity, and enhance continuity of operations and disaster recovery. What does this mean for your agency? Learn more by checking out VMware’s blog post on the announcement: http://vmw.re/1uEw4Mk