By Kyle Keller, Cloud Business Director, EMC Federal
Most of us in federal IT realize that a single solution rarely (if ever) meets application requirements across the enterprise. In the context of cloud, finding the right consumption and delivery model for applications must be based on functionality requirements, financial analysis, and overall risk profile.
In other words, cloud is not a “one size fits all” proposition. Many organizations pursue a “cloud of clouds” approach, leveraging multiple cloud environments to most efficiently deliver IT services. Some applications have security requirements that demand a private cloud infrastructure, while others are more conducive to an off-premises public cloud. After these decisions are made, based on risk, cost, and performance factors, the challenge becomes stitching together both internal and external cloud resources in a hybrid infrastructure capable of delivering a cohesive service catalog.
By John McCumber, Federal Technologist, RSA, the Security Division of EMC
When Santa received the letter below, he was taken aback. Did this miscreant really think Kris Kringle would go against his “naughty or nice” policy and grant wishes that compromise IT security? Not likely, and to help spread the word about “what not to do” when protecting sensitive data, Santa generously shared this rogue hacker’s wish list. As members of the federal IT community, let’s avoid these pitfalls as we head into 2014.
Formula 1 racing teams are facing a serious challenge as 2014 approaches. The most disruptive rule changes in the sport’s history are forcing manufacturers to rethink nearly every aspect of engineering and design. Cars will have smaller engines, they’ll be more fuel efficient, and they’ll be safer. For the past few years, F1 teams have been innovating to meet these new standards without sacrificing power, acceleration, or handling – and the smart ones are leaning on data to get the job done.
When I say data, I’m not just talking about lap times or gas mileage. Today’s F1 cars are equipped with more than 200 sensors that deliver real-time performance information on specific components like brakes and tires. In fact, these sensors produce more than 25 megabytes of data per lap, which adds up quickly over a full season of racing. To maximize the value of this data, the Lotus F1 team partnered with EMC to leverage the power of cloud computing and sophisticated analytics to better understand what’s working, what isn’t, and how design strategies need to evolve. And it’s the same approach our nation is employing to transform military engineering.
Much of the discussion around big data rightly centers on the technology tools that can empower organizations to collect, standardize, and analyze enormous amounts of information. But enterprises still need extremely talented people to employ these tools correctly, contextualize the findings, and make sound decisions based on the insights. Like cybersecurity, big data is more critical than ever to the success of organizations across both the public and private sectors, and it is rapidly evolving. Also like cybersecurity, the U.S. workforce doesn’t have enough people with the unique mix of skills needed to operate effectively in the current big data environment.
Who cares about big data? We know that tons of information is inundating federal agencies from new sources like sensors, mobile devices, satellites, and social media – but so what? The volume of data is irrelevant unless you can make sense of it in time to make better decisions. Fast data beats big data every time.
The difference is subtle but important. Big data involves collecting, storing, and analyzing information to figure out what happened after an event occurred, or what might happen in the future. Traditionally, this process takes time, and when the answers finally reveal themselves, it’s often too late to act.