All posts by Pierre-Yves Poli

Will this be the year of Hadoop? 6 predictions for 2015

January 8, 2015
Mike Wheatley

With the New Year finally upon us it seems as good a time as any to ask where Hadoop, the open-source Big Data framework, will be heading in 2015.

SiliconANGLE pulled forecasts from an assortment of analysts and industry experts who’ve tried to second-guess the next big developments in Hadoop, and the overwhelming consensus is that adoption will accelerate within the enterprise, as more businesses build smart applications with real-time data analysis capabilities atop the platform.

1. More market consolidation

Only ten years have passed since Google published its MapReduce whitepapers, notes MapR CEO and Co-founder John Schroeder, which means Hadoop is still at a relatively youthful stage of the technology maturity life cycle. What we can expect to see throughout 2015 is Hadoop enter a period of consolidation, with the number of vendors fighting for a piece of the action narrowing down.

“Hadoop is early in the technology maturity life cycle,” said Schroeder. “In 2015, we will see the continued evolution of a new, more nuanced model of OSS to combine deep innovation with community development. The open-source community is paramount for establishing standards and consensus. Competition is the accelerant transforming Hadoop from what started as a batch analytics processor to a full-featured data platform.”

Saggi Neumann, CTO of Xplenty, agreed, telling SiliconANGLE that “2015 will see more Big Data acquisitions and buyouts than ever before – in 2014, we witnessed the acquisitions of XA Secure, Hadapt, RainStor, DataPad and a few others. Both Cloudera and Hortonworks are now billion-dollar companies and are eagerly looking to acquire more brains and technologies. Other giants such as HP, IBM, Oracle, Pivotal and Microsoft are knee-deep in Hadoop business and we’ve yet to see the end of M&As in the category.”

2. Enterprise adoption to gather pace

Forrester Research has already put its reputation on the line and said Hadoop will become an enterprise priority in 2015, and the experts tend to share that opinion. According to Gary Nakamura, CEO of Concurrent, Inc., Hadoop is all set to become a “worldwide phenomenon” in 2015.

“Hundreds of thousands of data points reported from the Cascading ecosystem support the notion that Hadoop is rapidly spreading across Europe and Asia and soon in other parts of the world,” said Nakamura. “Therefore, there will be a strong Hadoop adoption next year for enterprises ramping up their data strategy around Hadoop, creating new jobs, and further disrupting the data market worldwide.”

Laurent Bride, CTO of Talend, said a growing number of enterprises will begin deploying Hadoop in more than just proof-of-concept environments. “Hadoop will be used for day-to-day operations,” he said. “Organizations are still exploring how best to adopt Hadoop as the primary data warehouse technology. But as Hadoop is used more and the capabilities of YARN become fully realized, more useful opportunities leveraging technology like Apache Spark and Storm will emerge and quickly increase its potential. Even now, real-time/operational analytics are the fastest moving part of the Hadoop ecosystem, and it’s becoming evident that by 2020 Hadoop will be relied on for day-to-day enterprise operations.”

3. SQL to become a “must-have” with Hadoop

SQL, the data query language so popular with developers, will become one of the most popular ways of working with Hadoop, the experts reckon.

“Fast and ANSI-compliant SQL on Hadoop creates immediate opportunities for Hadoop to become a useful data platform for enterprises,” said Mike Gualtieri of Forrester Research, adding that this will provide a sandbox for analysis of data that is not currently accessible.

Mike Hoskins, CTO at Actian, agreed, telling SiliconANGLE that “SQL will be a ‘must-have’ to get the analytic value out of Hadoop data. We’ll see some vendor shake-out as bolt-on, legacy or immature SQL-on-Hadoop offerings cave to those that offer the performance, maturity and stability organizations need.”
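To make the prediction concrete, here is the kind of plain ANSI SQL query that engines such as Hive, Impala, or Spark SQL run over Hadoop data. As a self-contained stand-in, this sketch uses Python’s built-in sqlite3 rather than an actual Hadoop engine; the table and data are invented for illustration, but the query shape is what “SQL on Hadoop” offers analysts.

```python
# Illustrative only: sqlite3 stands in for a SQL-on-Hadoop engine so the
# example runs anywhere. In a real deployment the same SQL would target
# tables backed by files in HDFS.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE clicks (user_id TEXT, page TEXT, ts INTEGER)")
conn.executemany(
    "INSERT INTO clicks VALUES (?, ?, ?)",
    [("u1", "/home", 100), ("u1", "/buy", 110), ("u2", "/home", 105)],
)

# A standard aggregate query -- no MapReduce code required of the analyst.
rows = conn.execute(
    "SELECT page, COUNT(*) AS hits FROM clicks GROUP BY page ORDER BY hits DESC"
).fetchall()
print(rows)  # [('/home', 2), ('/buy', 1)]
```

The appeal Hoskins describes is exactly this: analysts reuse the declarative SQL they already know, and the engine handles the distributed execution underneath.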

4. No more Hadoop skills shortage

One of the more surprising developments Forrester expects is that the Hadoop skills shortage, which has been so well documented in the last couple of years, will evaporate in 2015. “CIOs won’t have to hire high-priced Hadoop consultants to get projects done,” noted Forrester’s report. “Hadoop projects will get done faster because the enterprise’s very own application developers and operations professionals know the data, the integration points, the applications and the business challenges.”

Key to this will be SQL on Hadoop, Gualtieri said, as it will open the door to familiar ways of accessing Hadoop data. Meanwhile, commercial vendors and the open-source community alike are building better tools to make Hadoop easier for everyone to use.

5. Architecting around Hadoop

Cloudera chief technologist Eli Collins told SiliconANGLE he expects to see more users “architecting around Hadoop”, for example by using it as an Enterprise Data Hub rather than just for bespoke operations. He also expects more users to consume Hadoop embedded within larger applications.

“Analytics is becoming an important part of a lot of applications, often a core part of the application itself so we’ll continue to see more of this,” said Collins. “Customers are applying Hadoop across a lot of industries as they’ve been doing for the last several years, they’re just adopting it more extensively as the platform becomes more capable, more accessible and better integrated with the other technologies they use.”

6. Rise of Hadoop + real-time analytics

As enterprises increase their contribution to the Hadoop ecosystem’s rising growth, and as Hadoop becomes a more attractive alternative to traditional database vendors, the demand for real-time and transactional analytics will rise significantly in 2015, said Ali Ghodsi, head of product management and engineering at Databricks.

“In 2015, enterprises will continue to evolve from the initial incarnation of making use of data through offline operations and significant manual intervention, to one in which organizations will make decisions on streaming data itself in real-time, whether that be through anomaly detection, internet of things, etc.,” said Ghodsi.

“Enterprises will need infrastructures that can scale and ingest any type and size of data from any source and perform a variety of advanced analytics techniques to identify meaningful insights in the necessary amount of time to make an impact on the business. The rise of compatible processing engines such as Apache Spark will further enable Hadoop to help address these needs. This year, the approach to analytics will shift from operational and relational to more predictive.”
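The anomaly detection Ghodsi mentions can be sketched in a few lines: maintain rolling statistics over a stream and flag readings that deviate sharply. This is a minimal plain-Python illustration of the idea, not code from any Spark or Databricks product; the window size and z-score threshold are arbitrary choices for the example.

```python
# Minimal streaming anomaly detection sketch: rolling mean/std over a
# small window, flagging values whose z-score exceeds a threshold.
from collections import deque
import math

def detect_anomalies(stream, window=5, threshold=3.0):
    """Yield (value, is_anomaly) pairs using a rolling z-score."""
    recent = deque(maxlen=window)
    for value in stream:
        if len(recent) == window:
            mean = sum(recent) / window
            var = sum((x - mean) ** 2 for x in recent) / window
            std = math.sqrt(var) or 1.0  # avoid divide-by-zero on flat data
            yield value, abs(value - mean) / std > threshold
        else:
            yield value, False  # not enough history yet
        recent.append(value)

readings = [10, 11, 10, 12, 11, 10, 95, 11]
flags = [v for v, anomalous in detect_anomalies(readings) if anomalous]
print(flags)  # the spike at 95 is flagged
```

A production system would run the same logic continuously over an ingested stream (for example with Spark Streaming) rather than over an in-memory list, but the decision being made per record is the same.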

Year in Review: The Biggest Developments of 2014

December 29, 2014

What a difference a year makes! Over the past 12 months, we’ve witnessed the beginning of the end for traditional enterprise software. All the attention seems focused on Big Data, which belies the lingering array of data quality issues that continue to thwart efforts for handling good old-fashioned corporate “small” data. That said, the future of the data management biz has never been brighter, as evidenced by the monster-truck-sized investment that Cloudera received from Intel — a whopping $740 million for an 18% stake. Register for this episode of DM Radio to hear Host Eric Kavanagh interview Seth Proctor of NuoDB, Gary Nakamura of Concurrent, and two special guests.

A Decade On: The Evolution of Hadoop at Age 10

December 22, 2014
Scott Etkin

Apache Hadoop turns 10 in 2015. What started as an open-source project intended to enable Yahoo! Internet searches has become, in a relatively short time, the de facto architecture for today’s big data environments.

As big data exploded in 2014, Hadoop adoption and investment expanded along with it. Today, Hadoop is deployed across industries including advertising, retail, healthcare, social media, manufacturing, telecommunications, and government. But it won’t be long before companies begin demanding to see a return on their Hadoop investments.

“Hadoop has been rapidly adopted as the way to execute any go-forward data strategy,” said Gary Nakamura, CEO of Concurrent, Inc. “However, early adopters must now show return on investment, whether it’s migrating workloads from legacy systems or new data applications. Luckily, products and tools are evolving to keep pace with the trajectory of Hadoop.”

Indeed, Hadoop experts see the platform continuing to evolve and grow in 2015.

MapR Technologies CEO and co-founder John Schroeder predicts that, in 2015, new Hadoop business models will evolve and others will exit the market.

“We are now 20 years into open-source software adoption that has provided tremendous value to the market,” said Schroeder. “The technology lifecycle begins with innovation and the creation of highly differentiated products, and ends when products are eventually commoditized.

“Hadoop adoption globally and at scale is far beyond any other data platform just 10 years after initial concept,” he added. “In 2015, we’ll see the continued evolution of a new, more nuanced model of open-source software to combine deep innovation with community development. The open-source community is paramount for establishing standards and consensus. Competition is the accelerant transforming Hadoop from what started as a batch analytics processor to a full-featured data platform.”

Steve Wooledge, Vice President of Product Marketing at MapR, said he sees Hadoop-based data lakes and data hubs becoming the norm in enterprise data architectures in 2015, and self-service data exploration going mainstream.

“Hadoop as a data hub or data lake is a very standard and introductory use case for most organizations,” said Wooledge. “Companies are not sure what value there may be in untapped data sources, such as machine logs from the data center, social media, or mobile interactional data, but they want to harness the data and look for new insights, which they can inject into business processes and operationalize.”

Schroeder agreed.

“In 2015, data lakes will evolve as organizations move from batch to real-time processing and integrate file-based, Hadoop, and database engines into their large-scale processing platforms. In other words, it’s not about large-scale storage in a data lake to support bigger queries and reports. The big trend in 2015 will be around the continuous access and processing of events and data in real time to gain constant awareness and take immediate action.”

Ron Bodkin, founder and CEO of Think Big Analytics, said Hadoop will outgrow MapReduce in the coming year and Spark will grow in importance.

“One of the first things that we can expect from 2015 is that Hadoop clusters will start to benefit from other programming models besides MapReduce to deal with large data sets,” he said. “We already saw YARN begin to gain momentum in 2014 when it got across-the-board support from distribution providers like Cloudera as well as Hortonworks. Expect that this investment will begin to pay off in 2015 as more customers start leveraging YARN’s ability to support alternative execution engines, such as Apache Spark.”

Now that Hadoop has matured and gained widespread adoption, Bodkin said that the coming year could see late adopters finally feeling bold enough to embrace Hadoop.

“Hadoop has long since broken free of its web giant and ad tech heritage, penetrating most industries – notably music as streaming became ubiquitous,” said Bodkin. “In 2015, even late adopters will turn their attention to Hadoop, so expect an uptick in cost-driven implementations around better storage and faster load-times: SAN/NAS augmentation, ETL offload, and mainframe conversions.”

Monte Zweben, co-founder and CEO of Splice Machine, sees Hadoop evolving in the direction of concurrent applications in 2015.

“Concurrent Hadoop-based applications will become more prevalent in 2015 because of their ability to access real-time data and process transactions like a traditional RDBMS,” he added. “Emerging technologies that allow concurrent transactions on Hadoop enable data scientists and applications to work with more recent and accurate information instead of data that is hours or days old from batch processing. This is a major step in Hadoop’s ongoing evolution to meet the needs of businesses with mission-critical database applications that are having trouble cost-effectively scaling to meet higher data volumes.”

“Big data has bloomed in 2014 as enterprises have invested in platforms like Hadoop. As we enter 2015, getting more out of those initial big data investments will grow as a top priority for businesses,” Zweben said. “Increased competitive pressures and the current appetite for real-time information no longer allow for the old model of waiting for data scientists to take hours or days to generate insights based on out-of-date information. New developments in the Hadoop platform can power applications that can act on insights now, instead of later, and with more recent data.”

Spark 1.2 challenges MapReduce’s Hadoop dominance

December 22, 2014
Serdar Yegulalp

Apache Spark, the in-memory and real-time data processing framework for Hadoop, turned heads and opened eyes after version 1.0 debuted. The feature changes in 1.2 show Spark working not only to improve, but to become the go-to framework for large-scale data processing in Hadoop.

Among the changes in Spark 1.2, the biggest items broaden Spark’s usefulness in multiple ways. A new elastic scaling system allows Spark to better use cluster nodes during long-running jobs, which has apparently been requested often for multitenant environments. Spark’s streaming functionality, a major reason why it’s on the map in the first place, now has a Python API and a write-ahead log to support high-availability scenarios.

The new version also includes Spark SQL, which allows Spark jobs to perform Apache Hive-like queries against data, and it can now work with external data sources via a new API. Machine learning, all the rage outside of Hadoop as well, gets a boost in Spark thanks to a new package of APIs and algorithms, with better support for Python as a bonus. Finally, Spark’s graph-computing API GraphX is out of alpha and stable.
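For readers unfamiliar with Spark’s programming model, the transformation chain at its core can be mimicked in plain Python. The sketch below is a word count written with stand-ins for Spark’s flatMap, map, and reduceByKey operations; it is not PySpark itself (no cluster is involved), and the names follow Spark’s RDD API only loosely.

```python
# Spark-style word count expressed with plain-Python equivalents of the
# RDD operations a PySpark job would chain together.
from collections import defaultdict

lines = ["big data on hadoop", "spark on hadoop"]

# flatMap: split each line into words
words = [w for line in lines for w in line.split()]

# map: pair each word with a count of 1
pairs = [(w, 1) for w in words]

# reduceByKey: sum the counts for each word
counts = defaultdict(int)
for word, n in pairs:
    counts[word] += n

print(sorted(counts.items()))
```

In real PySpark the same pipeline runs distributed across a cluster, and with Spark SQL the whole computation can instead be expressed as a `GROUP BY` query, which is precisely the convenience the 1.2 release is expanding.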

Spark’s efforts to ramp up and expand speak to two ongoing efforts within the Hadoop world at large. The first is to shed the straitjacket created by legacy dependencies on the MapReduce framework and move processing to YARN, Tez, and Spark. Gary Nakamura, CEO of data-application infrastructure outfit Concurrent, believes the “proven and reliable” MapReduce will continue to dominate production over Spark (and Tez) in the coming year. However, MapReduce’s limitations are hard to ignore, and they constrain the work that can be done with it.

Another development worth noting is Python’s expanding support for Spark — and Hadoop. Python remains popular with number-crunchers and is well suited to Hadoop and Spark, but most of Python’s support there has been confined to MapReduce jobs. Bolstering Spark’s support for Python broadens its appeal, and Hadoop’s in general, beyond the typical enterprise Java crowd.

Much of Spark’s continued development has come through contributions from Hadoop shop Hortonworks. The company has deeply integrated Spark with YARN, is adding security and governance by way of the Apache Argus project, and is improving debugging.

This last issue has been the focus of criticism in the past, as programmer Alex Rubinsteyn has cited Spark for being difficult to debug: “Spark’s lazy evaluation,” he wrote, “makes it hard to know which parts of your program are the bottleneck and, even if you can identify a particularly slow expression, it’s not always obvious why it’s slow or how to make it faster.”
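The pitfall Rubinsteyn describes can be demonstrated without Spark at all, since Python generators are lazy in the same way as RDD transformations: the expensive work does not run on the line where the pipeline is defined, so naive profiling points at the wrong place. This is an illustrative sketch; the sleep stands in for an expensive per-record computation.

```python
# Lazy evaluation moves cost from where a pipeline is written to where
# it is consumed -- the debugging headache cited for Spark.
import time

def slow_square(x):
    time.sleep(0.01)  # stand-in for an expensive per-record computation
    return x * x

data = range(50)

start = time.time()
pipeline = (slow_square(x) for x in data)  # "transformation": nothing runs yet
define_time = time.time() - start          # essentially zero

start = time.time()
result = sum(pipeline)                     # "action": all the work happens here
eval_time = time.time() - start

# Profiling the definition line would wrongly suggest the map is free;
# the half-second of work only surfaces at the sum().
print(result, define_time < eval_time)
```

In Spark the effect is amplified: a chain of lazy transformations collapses into one job at the first action, so a slow stage shows up far from the code that caused it.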

10 Predictions for Data and Analytics in 2015

December 19, 2014
Adam Shepherd

As analytics continues to play a larger role in the enterprise, the need to leverage and protect data looms larger. According to IDC, the big data and analytics market will reach $125 billion worldwide in 2015. Here are 10 predictions from industry experts about data and analytics in 2015.

  1. Hadoop – Hadoop will become a worldwide phenomenon, believes Concurrent CEO Gary Nakamura, who notes that Hadoop has shown tremendous growth throughout Europe and Asia, and that expansion will only continue. Key to Hadoop becoming an enterprise backbone is the ROI businesses can expect from using it, and products and tools continue to evolve to keep pace with the technology’s trajectory. According to Actian, SQL will be a “must-have” to get the analytic value out of Hadoop data. We’ll see some vendor shake-out as bolt-on, legacy or immature SQL-on-Hadoop offerings cave to those that offer the performance, maturity and stability organizations need.
  2. Enterprise Security – With the seemingly never-ending stream of news reports of hacks and data leaks, one of the major data issues of 2014 that we can expect to continue in 2015 is big data breaches. “There is nothing you can do to stop a zero-day vulnerability, but the question is what do we do about it,” stated Walker White, president of data-as-a-service provider BDNA. At this point it isn’t about keeping the hackers out, but how companies react to protect their data once the hackers have penetrated their systems. “Security ultimately is an arms race, there are very few mechanisms that simply can’t be broken, it tends to just be how far ahead can you stay of the people that are trying to break in,” agreed Seth Proctor, CTO of NuoDB.
  3. Business Intelligence – The growth of BI tools that are friendlier to the average business employee will help take some of the burden off IT teams. To that end, more BI providers will incorporate search into their interfaces to make the tools more accessible to average business users, according to Thoughtspot CEO Ajeet Singh.
  4. Cloud – The cloud will increasingly become the deployment model for BI and predictive analytics, particularly the private cloud, driven by its cost advantages, according to Actian.
  5. Hybrid Architecture – Hybrid architectures will become the norm for many organizations, according to Steven Riley of Riverbed Technology. Even though cloud computing and third-party hosting will continue their rapid expansion, on-premise IT will remain a reality for 2015 and beyond. “In the coming year, analytics will have the power to become the next killer app to legitimize the need for hybrid cloud solutions,” adds Revolution Analytics CEO Dave Rich. “Analytics has the ability to mine vast amounts of data from diverse sources, deliver value and build predictions without huge data landfills. In addition, the ability to apply predictions to the myriad decisions made daily – and do so within applications and systems running on-premises – is unprecedented.”
  6. Medical Data – When the average person thinks about personal data security, they tend to think of credit card information. But Bitglass, which provides security for cloud apps and mobile devices, believes that medical records are 50 times more valuable on the black market than credit cards, and predicts that medical records will become a bigger target for data attacks than traditional targets such as credit cards. This will bring increased scrutiny pertaining to HIPAA regulations, which stipulate that health organizations must report data breaches affecting more than 500 people.
  7. Data Science – As organizations gain a greater appreciation of the role that data plays, data scientists are in greater demand, yet there are not enough qualified data scientists, according to EXASOL, an in-memory database company. Joe Caserta of Caserta Concepts believes that chief analytics officers (CAOs) will now play a role in the enterprise. As data-rich organizations continue to adopt a more strategic approach to big data, it makes sense that the responsibility for all that information needs to sit with someone who can apply the analytics big picture to all parts of the organization – the CAO. The coming year will be the time for data-driven organizations to dedicate resources and executive commitment to the function.
  8. Internet of Things – OpenText predicts consumers will begin to become more aware of the IoT all around them – from smart watches to cars with built-in sensors, and Vormetric, a provider of security solutions, believes that the IoT will trigger a greater enterprise emphasis on securing big data using encryption. More personalized private data will be stored and analyzed by data analysis tools in the future.
  9. Location Data – Technologies will emerge in 2015 – full stack virtualization, pervasive visibility, and hybrid deployments – that will create a form of infrastructure mobility that allows organizations to optimize for location of data, applications, and people, says Riley of Riverbed. He predicts that organizations that begin to disperse their data to multiple locations will begin to gain significant competitive advantages.
  10. NewSQL – NewSQL will start taking the place of some RDBMSs, according to Morris of NuoDB, who believes that NewSQL will begin to support enterprise-scale applications that were traditionally the preserve of RDBMSs.

Data and analytics will only become more important and valuable to the enterprise. As the technologies for putting data to greater use continue to multiply, it is clear that those opportunities also carry risk, and there is the need to better protect the data that is being amassed.

Big Data: 6 Bold Predictions For 2015

December 19, 2014
Jeff Bertolucci

‘Tis the season when industry soothsayers don their prognostication caps to make fearless forecasts for the coming year. What does the crystal ball say about big data?

We culled an assortment of 2015 predictions from big data executives and analysts. The overarching theme: Big data gets real next year, as does the Internet of Things. What do you think? Will these prophecies come true, or are they better suited for the Psychic Friends Network?

Prediction #1: Big data proves it’s more than just big hype.
“In 2014 the booming ecosystem around Hadoop was celebrated with a proliferation of applications, tools, and components. In 2015, the market will concentrate on the differences across platforms and the architecture required to integrate Hadoop into the data center and deliver business results.” — MapR CEO and cofounder John Schroeder.

Prediction #2: On a similar note, 2015 will be Hadoop’s “show me the money” year.
“Hadoop has been rapidly adopted as ‘the way’ to execute any go-forward data strategy. However, early adopters must now show return on investment, whether it’s migrating workloads from legacy systems or new data applications. Luckily, products and tools are evolving to keep pace with the trajectory of Hadoop.” — Gary Nakamura, CEO of Concurrent.

Prediction #3: Location services move indoors.
“Indoor location technology and services will rapidly gain traction. Where previously WiFi was the primary enabler to position a mobile device indoors, its inability to calculate elevation, coupled with errors introduced through signal noise, has meant that using WiFi alone indoors was frequently not accurate enough. However, with BLE (Bluetooth Low Energy) beacons now increasing in number, these can combine with WiFi access points while using the device-embedded MEMS (Micro-electro-mechanical-systems) sensors to provide accurate location indoors.” — Juniper Research’s “Top 10 Tech Trends for 2015” whitepaper.

Prediction #4: Connected cars might grab the headlines, but other IoT devices will prove a lot more useful.
“Autonomous vehicles such as drones and self-driving cars will dominate public perception of the IoT. Less-glamorous connected objects will make the greatest impact on people’s lives — many without them even knowing it.” — Brian Gilmore, an Internet of Things and industrial data expert at Splunk.

Prediction #5: You, too, can be a data scientist, no PhD required.
“As data becomes more accessible and analytic tools become easier to use and readily available, data science won’t be limited to those in the technology sector. In 2015, anyone with the right tools can draw powerful insights from data. We’re not blasting CS degrees, but in 2015, data scientists’ skillsets will be vastly different, especially as the ability to code will be less of a job requirement. Data scientists should take a page out of anthropology and understand that qualitative information can also provide answers to questions you didn’t know you had.” — Lukas Biewald, CEO and cofounder of CrowdFlower, a data-mining and crowdsourcing service.

Prediction #6: The Internet of Things will have a big impact on customer service, with consumers expecting more personalized interaction with vendors.
“The Internet of Things changes the entire customer service dynamic; rather than a limited number of customer communication channels, customer experience management (CEM) systems will be able to process live streams of data from fitness wearables, motor vehicles, home appliances, and medical instruments, to name only a few categories of connected devices on the horizon. When collected, correlated, and applied, the data from these devices will coalesce into an unprecedented view of the customer’s needs, resulting in far greater competitive advantage for those who are aware of the possibilities.” — Keith McFarlane, CTO and senior vice president of engineering at LiveOps, a cloud-based customer service provider.

15 APM Predictions for 2015

December 16, 2014
APM Digest

The annual list of Application Performance Management (APM) predictions is the most popular post on APMdigest, viewed by tens of thousands of people in the APM community around the world every year. Industry experts – from analysts and consultants to users and the top vendors – offer thoughtful, insightful, and sometimes controversial predictions on how APM will evolve and impact business in 2015.

Some of the predictions on this year’s list have never been seen on our previous APM Predictions lists. Some predictions foresee total upheaval in APM and related markets. Other predictions are continuations from previous years, maybe a little more predictable but no less important or disruptive.

Overall, the outlook remains strong for APM in general. “The APM sector will continue to thrive in 2015,” predicts Karun Subramanian, an application support consultant who started posting on APMdigest this year. “It is amazing how a number of businesses still have not implemented an APM solution.”

The list of predictions shows there is a lot of potential for APM in 2015, not just because many organizations still need to get onboard with APM, but because the technology is advancing so rapidly and expanding in so many directions.

Some predictions will be right on the money, and others may not come true. Many of the predictions overlap, just as concepts such as end user experience, mobile APM and DevOps all overlap in the real world. This list is not intended to be clear cut or definitive, but all of the predictions are interesting and make great reading that will get you thinking about all the exciting possibilities for next year.

The first 5 predictions are posted below. The next 5 will post tomorrow, and the final 5 will post on Thursday, followed by some more in-depth predictions from our bloggers in the following days.

A forecast by the top minds in Application Performance Management today, here are 15 APM Predictions for 2015 – Part 1:


In a time when businesses are literally being re-coded by software, applications have become the face of the business. In the age of rapid adoption and rapid rejection, enterprises have mere seconds to impress their users. In 2015, Application Performance Management solutions will not be just about the performance of applications or business transactions; their focus will move to helping enterprises inspire their users and deliver exceptional user experiences in order to earn their loyalty.
Anand Akela
Sr. Director and Head of APM Product Marketing, CA Technologies

Adoption of mobile workspaces to provide anywhere, anytime access to workforce apps will cross the chasm in 2015. As a result, organizations will need to validate expected gains in workforce productivity with a unified approach to End User Experience Management (EUEM) that covers mobile, virtual, and physical devices. While advanced analytics will play a critical role in providing insights into the impact of infrastructure performance in these converged environments, organizations focused on transforming their businesses into proactive enterprises will make EUEM the center of their monitoring strategy in order to effectively measure, manage, and improve workforce productivity.
Mike Marks
Chief Product Evangelist, Aternity

I predict that more enterprises will adopt a strategic, unified approach to application performance and user experience to improve employee productivity and engagement, and to build customer satisfaction and loyalty. Increasingly, C-level executives will recognize the link between assuring consistently superior user experiences and achieving strategic objectives and financial outperformance. This will make a unified performance analytics platform second only to database as the most strategic software in the enterprise. More vendors will have to evolve their offerings toward a framework approach.
Gabe Lowy
Technology Analyst and Founder of Tech-Tonics Advisors

Digital systems that deliver experiences and support digital commerce will become the systems of record, providing clear lines from business outcomes to user behaviors to user experience to delivery infrastructure. This likely means that APM systems will begin to adopt more customer experience and analytics capabilities to help drive contextual customer experiences and understand not just what happened, but which user experiences led to less desired outcomes and how they can be improved.
Ken Godskind
Chief Blogger and Analyst,

The sophistication of inbuilt UEM/RUM capability will evolve in a number of dimensions, in particular object-level data and session-based metrics, optimizing business relevance.
Larry Haig
Senior Consultant, Intechnica


Growth of the Borderless Enterprise: Last year, “hybrid cloud” was the shiny new buzzword. Today, applications are also incorporating mobile technologies, social media, and Internet of Things (IoT) data into hybrid environments, while continuing to integrate to cloud, partner, provider, and customer application ecosystems. Viewed from the “end to end APM” perspective, the illusion of control has essentially vanished, yet the need for visibility to performance and availability remains. While 2014 has been a year of explosive change, it’s likely that 2015 will see APM vendors and their customers digesting these changes and adapting accordingly.
Julie Craig
Research Director, Application Management, Enterprise Management Associates (EMA)

In 2015, the ubiquitous nature of the cloud (especially SaaS application delivery), user mobility and wireless access will continue to usher in the age of the borderless enterprise. IT will encounter difficulties ensuring the end-user experience and efficiency of its workforce, obligating IT teams to re-think their performance management strategies in light of the expanded domain. Key considerations will include the ability to measure end-user experience regardless of location and establishing standard operating procedures for the implementation of new technologies and applications. IT teams will need to have full control of applications on the network, including the ability to evaluate service level agreements with SaaS providers. IT tool vendors will also need to reconcile the features they provide within the borderless enterprise by adjusting their APM and AANPM product line-up in order to fill this new IT visibility gap.
Bruce Kosbab
CTO, Fluke Networks


In 2014, we saw a significant rise in the adoption of APM as a concept. This year, we also saw a rise of various sub-domains within APM, including data analytics, mobile APM, and DevOps. 2015 will be about consolidating all those sub-domains, thereby meeting user expectations by bridging the gap between IT and digital groups within the organization.
Suvish Viswanathan
Manager, Product Marketing & Analyst Relations, ManageEngine

Today, traditional application and infrastructure monitoring, log and event analysis, user response time monitoring, byte-code type instrumentation and other tools are all important in helping IT understand what’s happening with their applications. However, each also only tells a portion of the story, meaning IT is left to try to piece together disparate data to get a holistic view of their application stack, which is no easy task. In 2015, IT will increasingly be able to see across more of these dimensions with an integrated view as vendors bring the capabilities each of these tools provide together for a much improved IT experience.
Michael Thompson
Director, Systems Management Product Marketing, SolarWinds

I predict that APM tools will start to incorporate bandwidth monitoring alongside application performance. As bandwidth-intensive applications become business-critical, root cause analysis becomes challenging without the ability to pinpoint the cause of a slowdown with confidence. Monitoring bandwidth and application performance simultaneously provides a more holistic understanding of the ecosystem’s health.
Megan Assarrane
Product Marketing Manager, Ipswitch

APM platforms will have to evolve to support the hybrid enterprise, providing end-to-end insight into performance and helping IT achieve even faster mean time to resolution. As cloud and mobility become mainstream in 2015, APM has to enable IT to pinpoint issues across the entire stack – from the end user device, through the network, to the app or data tiers hosted in the datacenter or in AWS or Azure. In addition to the holy grail of end-to-end visibility, APM will be driven to deliver high-resolution data and analytics to enable faster diagnosis and repair cycles.
Peco Karayanev
Sr. Product Manager, Riverbed


Mobile app APM will be the key focus in 2015 as mobile usage continues its growth, with the enterprise space now becoming significant. Closely tied to performance testing is security testing: mobile app security will also rise in importance in 2015.
Michael Azoff
Principal Analyst, Ovum

The market will continue to experience a rise in mobile APM. Today, we barely have any visibility into mobile app performance; it’s a black hole. As businesses see an increase in revenue through their mobile apps, the IT team’s imperative will be to provide a consistent user experience on all interfaces, both web and app. We will see mobile operation teams and Web app operation teams come together to ensure this unified experience while going beyond crash reports for apps.
Suvish Viswanathan
Manager, Product Marketing & Analyst Relations, ManageEngine

Shop Direct’s CEO commented this year that 50% of the company’s consumers viewed its site through a mobile device, but in 2015 100% of their customers will test content through their mobile app. 2015 will be do or die in terms of mobile. Mobile channels are exploding and businesses need to get the mobile experience right this year — not just one time, but on an ongoing basis with the right kind of APM tools. But Mobile APM as a standalone application will no longer exist in 2015. Businesses will realize the importance of reliance on backend infrastructure, which will necessitate end-to-end visibility across all of their applications from one comprehensive APM solution.
Maneesh Joshi
Sr. Director and Head of Product Marketing and Strategy, AppDynamics

We are in the midst of a massive mobile surge that is changing the way customers, employees or partners engage with businesses. Mobile is now a primary touch point and a driving force in the way millions of users bank, shop and transact. As a result of this shift, which will only continue to increase throughout 2015, organizations will need to focus on mobile application and web-delivered experiences. With this in mind, organizations must make optimizing performance, and capturing and analyzing all end user experiences, a top priority both to drive business success and to understand how users interact and engage across digital touch points.
Erwan Paccard
Director of Mobile Performance Strategy, Dynatrace

A “mobile first” mentality will become the focus in APM in 2015. As application developers realize the need to focus on developing apps that work across multiple devices, operating systems and scenarios, they will look to monitoring from the end user perspective throughout the process. Developers want to create the optimal user experience regardless of the user’s scenario, device or network, and will use continuous delivery to effectively drive coverage complexity. This shift will tighten the loop between Dev and Ops, with a refocus of APM solutions to adopt cloud-based, real devices and provide insight into the end user experience.
Amir Rozenberg
Director of Product Management, Perfecto Mobile

In 2015, traditional industries that have the most to lose from providing low quality mobile experiences will adopt mobile APM. At the top of the list will be retailers. These companies will follow best practices established by more nimble mobile-first and mobile-dominant firms, which in 2014 recognized that mobile client code needs to be watched.
Ofer Ronen

The trend for 2015 is clear: flexible work environments! Companies have tremendous interest in creating and empowering a mobile workforce to increase productivity and enhance employee work-life balance. The key to success is user adoption, and that requires reliable access, optimal app responsiveness and a solution for monitoring end-to-end performance. Only through real-time performance data, intelligent historical analytics and automatic correlation will IT pros have the universal insight and actionable intelligence they need to fine-tune their environment, proactively diagnose potential problems, prevent unscheduled downtime and keep end-users happy and productive.
Srinivas Ramanathan
CEO, eG Innovations

From gleaning the product reviews on IT Central Station, I can share with you one of the top features that real users cite as lacking in current APM tools: mobile app monitoring. I predict vendors will address this critical need in 2015.
Russell Rothstein
Founder and CEO, IT Central Station


The advent of the “Internet of Things” (IoT) will elevate the importance of implementing powerful, easy-to-use and cost-effective APM solutions as a rapidly expanding universe of end-points are connected by software-enabled sensors and systems. The new generation of APM solutions will have to contend with an exponentially greater number of connections, transactions and data points. The APM solutions will also have to span Cloud and on-premise applications which will be linked together in the IoT environment. The task of implementing and administering the APM solutions will increasingly be performed by highly specialized, third-party service providers.
Jeffrey Kaplan
Managing Director of THINKstrategies and Founder of the Cloud Computing Showplace


APM as a monitoring entity is expanding with new sub-categories of technology that complement its capabilities. Some of these technologies will remain on the periphery; however, others will naturally become part of APM as the market solidifies. I foresee advanced analytics and behavioral learning technologies being incorporated as product offerings from the most advanced APM solutions on the market today.
Larry Dragich
Director of Enterprise Application Services at the Auto Club Group and Founder of the APM Strategies Group on LinkedIn.

In 2015, analytics will continue to be a top APM feature. In 2014, we saw a number of APM solutions bringing out analytics features. This will continue in 2015 as APM increasingly becomes about making forward-looking insights by allowing easy querying and presentation of application-centric insights. In addition, in 2015, the primary reason to invest in APM solutions will not be to reduce MTTR. Since the dawn of monitoring and management solutions, their main benefit has been to reduce MTTR. This will change in 2015 as smart IT leaders realize that good APM and analytics solutions should prevent application issues from happening in the first place. Therefore in 2015, I expect that IT leaders will focus on APM and analytics solutions that can improve metrics such as mean time between failures (MTBF) rather than making MTTR reduction the base necessity.
John Rakowski
Analyst, Infrastructure and Operations, Forrester Research
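Rakowski's contrast between MTBF and MTTR rests on a standard reliability identity: availability = MTBF / (MTBF + MTTR), so both preventing failures and repairing them faster raise uptime. A minimal sketch (the hour figures are invented):

```python
def availability(mtbf_hours: float, mttr_hours: float) -> float:
    """Fraction of time a service is up, from the standard reliability
    identity: availability = MTBF / (MTBF + MTTR)."""
    return mtbf_hours / (mtbf_hours + mttr_hours)

# Both levers improve on the baseline: repairing twice as fast (lower
# MTTR) or failing half as often (higher MTBF). Figures are invented.
baseline = availability(mtbf_hours=720.0, mttr_hours=4.0)
faster_repair = availability(mtbf_hours=720.0, mttr_hours=2.0)
fewer_failures = availability(mtbf_hours=1440.0, mttr_hours=4.0)
```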

Analytics will begin serving the needs of others, and the advantages of deep instrumentation will begin to show differences between products which have APM capabilities and those which do not. We will see advances in distributed network analysis, which were previously not handled by today’s cast of characters. Analytics will begin to advance beyond the search and presentation focused offerings of today.
Jonah Kowall
Research VP, IT Operations, Gartner

In 2015 analytics driven APM will mature. In 2014 we saw a growing trend of log analysis usage for better problem diagnosis. In 2015 this trend of search analytics will continue and will become more tightly integrated with APM. The other key area will be self-learning dynamic thresholds to predict problems beforehand rather than detecting them. Another space where analytics will become prominent is optimization of event noise through smarter event correlation. I also think analytics will evolve from detecting and predicting issues to prescribing recommendations and automated actions to resolve application performance problems.
Payal Chakravarty
Sr. Product Manager – APM, IBM
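The “self-learning dynamic thresholds” Chakravarty mentions can be approximated very simply with a rolling baseline: flag any metric sample that strays more than k standard deviations from the recent mean. This is a toy sketch of one common baselining approach, not any vendor's actual algorithm:

```python
from collections import deque
from math import sqrt

class DynamicThreshold:
    """Toy self-learning threshold: a sample is anomalous when it falls
    more than k standard deviations from the rolling mean of the last
    `window` samples. A stand-in for adaptive baselining in general."""

    def __init__(self, window: int = 60, k: float = 3.0):
        self.samples = deque(maxlen=window)
        self.k = k

    def observe(self, value: float) -> bool:
        anomalous = False
        if len(self.samples) >= 10:  # need some history before judging
            n = len(self.samples)
            mean = sum(self.samples) / n
            var = sum((s - mean) ** 2 for s in self.samples) / n
            anomalous = abs(value - mean) > self.k * sqrt(var)
        self.samples.append(value)
        return anomalous

detector = DynamicThreshold(window=60, k=3.0)
# Steady response times near 100 ms teach the detector a tight baseline...
flags = [detector.observe(100.0 + (i % 5)) for i in range(50)]
# ...so a 500 ms spike stands out immediately.
spike_flagged = detector.observe(500.0)
```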

In 2015, real-time analytics will be a “need to have,” not a “nice to have,” for enterprises that compete based on the strength of their IT services. The companies that will thrive in today’s instant-access marketplaces are those that can identify problems early and begin resolving them before they have huge adverse effects on customers.
Kevin Conklin
VP of Marketing, Prelert

We love to measure everything in the monitoring world, and 2015 is going to be no different. New advances in technology and the expansion of mobile devices will mean even more data to be collected, and with new data will come an expansion of analytics. We’re already seeing new analysis capabilities in the form of CDF charts and historical comparisons, and I’m anticipating being able to drill down even deeper into data – and integrate it into existing metrics – in order to provide the best possible look at the impact of performance.
Mehdi Daoudi
CEO and Founder, Catchpoint
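The CDF charts Daoudi refers to are easy to derive from raw latency samples, and they expose tail behavior that averages hide. A minimal sketch (the sample latencies are invented):

```python
import math

def empirical_cdf(samples):
    """Return (value, cumulative fraction) points of an empirical CDF."""
    xs = sorted(samples)
    n = len(xs)
    return [(x, (i + 1) / n) for i, x in enumerate(xs)]

def percentile(samples, p):
    """Nearest-rank percentile for 0 < p <= 100."""
    xs = sorted(samples)
    rank = max(1, math.ceil(p / 100 * len(xs)))
    return xs[rank - 1]

latencies_ms = [120, 95, 110, 300, 105, 98, 102, 2500, 115, 101]
points = empirical_cdf(latencies_ms)  # plot these for a CDF chart
p50 = percentile(latencies_ms, 50)   # the median looks healthy...
p95 = percentile(latencies_ms, 95)   # ...while the tail reveals the outlier
```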


Big Data has become almost a mainstream word. But analytics for Big Data, not so much. In 2015 we will start to see the walls between business and IT begin to crumble (or at least further crack) as the business’s need to rapidly analyze large volumes of data for perishable insights becomes paramount. To accomplish this we will need applications that can rapidly stream data for real-time analysis. APM will be used to ensure these newly critical applications perform effectively. This year we will see the need for real-time, big data analytics drive the importance of APM as business and IT collaborate to make this work.
Charley Rich
VP Product Management and Marketing, Nastel Technologies

We have seen only the beginning of the M2M wave of big data associated with APM. It’s not only the volume that will go up, but the ways in which it flows. Systems must be able to cope with and normalize this multifaceted data in real time.
Vess Bakalov
Co-Founder and CTO, SevOne

I expect to see a new generation of IT operations analytics tools, based on blended analytics that can more proactively detect anomalies, predict outages, provide deep diagnostics and resolve issues within a real-time business context. By correlating various silo-sourced data (log, performance, configurations, security etc.), the next generation of IT Operations Analytics tools will be better positioned to sift through terabytes of operations data in real time, spotting and presenting issues to users in a more understandable context.
Sasha Gilenson
CEO, Evolven
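The cross-silo correlation Gilenson describes often begins with something as plain as joining performance alerts to recent configuration changes on the same service. A toy sketch (the event shapes and data are invented for illustration):

```python
from datetime import datetime, timedelta

# Hypothetical events from two silos; the shapes and data are invented.
alerts = [
    {"time": datetime(2015, 1, 8, 10, 17), "service": "checkout", "metric": "p99_latency"},
]
changes = [
    {"time": datetime(2015, 1, 8, 10, 5), "service": "checkout", "change": "jvm heap resized"},
    {"time": datetime(2015, 1, 8, 7, 0), "service": "search", "change": "index rebuild"},
]

def correlate(alerts, changes, window=timedelta(minutes=30)):
    """Pair each alert with changes to the same service that landed in
    the preceding window: the naive core of 'what changed just before
    this broke?'"""
    pairs = []
    for alert in alerts:
        for change in changes:
            same_service = change["service"] == alert["service"]
            recent = timedelta(0) <= alert["time"] - change["time"] <= window
            if same_service and recent:
                pairs.append((alert, change))
    return pairs

suspects = correlate(alerts, changes)
```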

From gleaning the product reviews on IT Central Station, I can share with you one of the top features that real users cite as lacking in current APM tools: deep analytics. I predict vendors will address this critical need in 2015.
Russell Rothstein
Founder and CEO, IT Central Station

Performance Management in Big Data will become a significant revenue and market capturing opportunity for major APM players in 2015.
Gary Nakamura
CEO, Concurrent


The gap between business analytics and IT analytics is quickly narrowing. In 2015, software analytics and business analytics will be viewed as one and the same, and as a critical piece of business intelligence for stakeholders on both sides of the equation.
Maneesh Joshi
Sr. Director and Head of Product Marketing and Strategy, AppDynamics

In 2015, digital marketing analytics solutions will collide with APM analytics features. As APM solutions push upwards into the business with the value they provide, I expect analytics features to encroach on features provided by digital marketing analytics solutions (e.g. Google Analytics). Successful APM solution providers will co-align with digital marketing solution providers through strategic partnerships to shift the focus from just application performance to digital service performance.
John Rakowski
Analyst, Infrastructure and Operations, Forrester Research


In 2015, APM will break out of the back office and onto the CXO’s desk as it transforms a vast array of disconnected service and infrastructure data points into an analytics dashboard that is accessible to both line-of-business and IT users. This APM analytics dashboard will become a strategic weapon that better supports the business by ensuring business applications are optimized, highly available and accessible to anyone from anywhere.
Bill Berutti
President of the Performance and Availability Business at BMC Software

The focus on customer experience management will drive organizations to undertake end-to-end monitoring of all of their web, native mobile, mobile-enabled web and API assets, using a single platform. These platforms will also necessarily evolve to become more “answer-centric,” with the ability to surface differing levels of actionable insights and pertinent detail to a diverse group of stakeholders – business owners, IT/Ops personnel, QA engineers and developers.
Denis Goodwin
Director of Product Management, AlertSite by SmartBear

From a functional perspective, competitive pressure will drive an increased focus on accessibility, particularly from those vendors with high end APM solutions.
Larry Haig
Senior Consultant, Intechnica


The APM frenzy will start to expose shortcomings in terms of integrated insights into change management, capacity optimization and broader alignment with business values that will move the discussion closer to Business Service Management (BSM). This was my prediction last year — and I’ve already seen trends in this area, exacerbated by cloud and DevOps, among other things. Maybe 2015 will be the year when the industry finally takes a deep breath and recognizes the need for a new, more dynamic service-aware management system that’s truly cross-silo.
Dennis Drogseth
VP of Research, Enterprise Management Associates (EMA)

My biggest prediction for APM in 2015 is that it needs a name change. As digital experiences become the primary way brands engage with users, APM is moving up the business stack. I’ve dubbed this Unified Business Monitoring.
Ken Godskind
Chief Blogger and Analyst


In 2015, focus on cloud monitoring will continue to rise.
Karun Subramanian
Application Support Expert

For all of the talk about cloud diversity, the majority of the users in the market have been slow to adopt a cloud diverse development practice. The juggernaut cloud provider is still obviously AWS, as they reportedly hold 17 times the market share of their next 5 competitors put together. And when we talk to our users, their applications in the cloud are primarily running on AWS. In 2015 we should see true cloud diversity take hold, forcing many APM providers, who have prioritized AWS in development, to get on board with universal coverage for cloud providers. Those who have taken a “cloud-first” approach to developing their APM solution will find this transition much easier than those who have to re-engineer or cobble together a mixture of legacy and next-gen solutions that can span multiple physical and virtual environments.
Josh Stephens
VP of Product Strategy, Idera

In 2015, we expect APM will increasingly be focused on cloud performance management (CPM), as applications become decoupled and components are distributed across public, private and hybrid cloud environments. Increased visibility into log-level analytics will be critical to APM as access to code and application metrics becomes increasingly untenable.
Andrew Burton
CEO, Logentries

In 2015, APM will expand to cover the growing ecosystem of SaaS applications that increasingly power modern organizations. Traditional APM has covered apps such as web apps and on-premise data stores, but as businesses continue to move to the cloud, APM will have to cover the intersection of applications built by the business and applications bought by the business. Distributed applications communicating with each other is increasingly the fabric of modern businesses, and 2015 is the year that APM steps up to monitoring the entire ecosystem.
Dan Kuebrich
Product Director, Application Performance, AppNeta

There are a lot of apps being developed and hosted in the cloud, and those “producers” need to monitor and manage their own app performance, but what about the customers – the enterprise IT and business operations teams purchasing and consuming these cloud apps and services; e.g., Office 365, Google Apps, Workday, DropBox, Expensify, etc.? There are a lot more of these SaaS app “consumers” than there are “producers” – and these consumers still own application performance management and still support users who expect them to maintain high application service levels regardless of where the app runs. It’s still Application Performance Management, but the requirements are fundamentally different, and it’s this emerging need that will disrupt and reshape the APM landscape the most in 2015.
Patrick Carey
VP Product Management and Marketing, Exoprise


In 2015, the migration to 10G networks and the increasing adoption of virtualization will intensify the pressure on APM vendors. The massive amount of data to be analyzed will challenge the industry to combine deep transaction analysis with full detail retention at Layer 7 over millions of transactions. In addition, vendors not in a position to monitor transactions over virtual networks will be out of the game.
Managing Director, SecurActive Performance Vision

2015 will see the need for Application Performance Management and Application-Aware NPM (AA-NPM) production tools to comprehend two new domains: what is going on inside virtualized servers (virtual machines [VMs] and virtual switches); and visualizing virtual networks, especially around OpenStack. Both of these technologies can have a large impact on production application performance and quality of service. By providing visibility between configuration changes in virtual servers and virtual networks and application performance changes, AA-NPM production tools will simplify the IT staff’s job of understanding unexpected events, as well as seeing whether changes in the underlying infrastructure produced the results that were expected.
Mike Heumann
VP, Product Marketing and Alliances, Emulex

Seismic shifts are taking place in the enterprise, and those shifts mean that application and infrastructure performance management (IPM) must adapt to new realities. There’s no stopping the rampant adoption of mobility, cloud and hybrid cloud computing. The growth in virtualization of compute, storage and networking remains unfettered. And adoption of web-scale computing is burgeoning, and so is the DevOps organization. All of this means that the sophistication of performance management tools must evolve at hyper-speed. To keep pace, those tools will also have to be highly scalable. They must extend end-to-end, from the user to the backend infrastructure, to include clients, servers, network and storage, as well as support virtualized workloads of all kinds, wherever they reside, whether web, e-commerce or other business apps. Yes, that’s a tall order – but not optional. The APM solutions of 2015 must become infrastructure-aware and the virtualization/IPM solutions must become application-aware!
S. “Sundi” Sundaresh
CEO, Xangati


In the next year, we’ll see companies spend more time and money looking at how they can optimize application performance from the ground up utilizing containerization technology, such as Docker. This trend will be evident across many application and technology types — from web applications to big data analysis engines.
Charlie Key
Founder of Modulus, a Progress company

As Docker continues to gain momentum in organizations adopting DevOps and cloud computing, APM will focus more on container-driven, microservices architectures in 2015. This shift away from monolithic to microservice applications will mean an even greater need for visibility into complex, distributed environments. As a result, APM will evolve to provide even richer data coupled with more powerful analytic capabilities.
Christine Sotelo
Product Marketing Manager, New Relic

Virtualization of servers, networks, and the abstraction of the entire resource infrastructure will challenge APM solutions to maintain operational visibility, reduce troubleshooting time and offer insight into how to optimize IT services. Our prediction for 2015 is that enterprises will ramp their orchestration efforts to achieve enhanced service delivery performance and business efficiencies. Service orchestration will enhance agility to incorporate dynamic application rollouts and the capability to deploy hybrid infrastructure architectures.
Brad Reinboldt
Sr. Product Manager, Network Instruments/JDSU

In 2015 container virtualization will provide the #1 solution for unlocking the promise of big apps. In 2015, containerization will move beyond just Linux (i.e., Docker), into the Windows world. Once there, Windows-based containerization will provide its users with a number of important benefits, such as the ability to dramatically increase application performance and mobility, simplify day-to-day management tasks – such as patch management and asset utilization optimization, ensure high availability (HA) and the operational integrity of the business, and consequently also deliver significant economic benefit across the entire enterprise.
Don Boxley
CEO and Co-Founder, DH2i

A big factor in the upcoming year will be the growth of SDN. As the infrastructure becomes application aware we will see a lot of value being derived from understanding and correlating the performance of applications with the underlying virtualization, server and network infrastructure. Data is only as good as the decisions it allows us to make, and with the flexibility inherent in SDN, we have a lot more options in how we scale and deliver our applications. In order to do effective APM, we must have a holistic view across the whole delivery stack.
Vess Bakalov
Co-Founder and CTO, SevOne


2015 will bring about a great divide within APM and its subcategories. While code-level APM will continue to increase adoption inside application development, a newer category described by leading analysts as Application Operations Management (AppOps) and Application-Aware Infrastructure Performance Monitoring (AA-IPM) will emerge due to the growing demand for visibility from those responsible for the shared infrastructure across the enterprise.
David Roth
CEO and Co-Founder, AppFirst

2015 will mark a significant shift in the way that APM tools are used by IT Operations teams. Driven by increased implementation of hybrid cloud-based applications and massively distributed applications, these teams will stop using APM tools as their go-to primary tool, opting for unified infrastructure/application monitoring solutions instead. APM tools will evolve into an integral code-debugging solution for developer-intensive DevOps processes.
Vic Nyman
COO and Co-Founder, BlueStripe


APM integration into the entire software development life cycle will be the standard for enterprises that want to stay agile. Development, testing and monitoring will be integrated at the core so that all three of these processes and supporting systems will work seamlessly together. In 2015, the software industry will have understood the benefits of development and operations working closely together as the DevOps movement continues to take hold. Testing will be an integral part of the mix so that all APM solutions will integrate with Continuous Integration, Continuous Delivery and Continuous Testing solutions.
Alon Girmonsky
Founder and CEO, BlazeMeter

In 2015, APM tools will evolve to enable a better DevOps culture. Integration of APM tools with deployment tools, visualization of pre-deploy and post-deploy performance patterns, and automated actions on deployment in response to performance degradation will be key enhancements in APM tools to aid the DevOps culture. Code-level diagnostics in development as well as production environments in the enterprise will become commonplace. Collaborative problem solving using virtual war rooms will also gain ground to help Developers, Operations and other parties work smoothly through problem diagnosis.
Payal Chakravarty
Sr. Product Manager – APM, IBM
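The “automated actions on deployment” idea can be reduced to a simple gate: compare latency before and after a deploy and roll back on a significant regression. A hedged sketch (the 20% threshold and sample numbers are invented):

```python
def should_roll_back(pre_deploy_ms, post_deploy_ms, max_regression=0.20):
    """Signal a rollback when the post-deploy median latency regresses by
    more than `max_regression` versus the pre-deploy median. The 20%
    default is an invented example threshold."""
    def median(xs):
        s = sorted(xs)
        mid = len(s) // 2
        return s[mid] if len(s) % 2 else (s[mid - 1] + s[mid]) / 2
    return median(post_deploy_ms) > median(pre_deploy_ms) * (1 + max_regression)

# A healthy deploy stays within the tolerance; a degraded one trips the gate.
healthy = should_roll_back([100, 102, 98, 101], [104, 99, 103, 101])
degraded = should_roll_back([100, 102, 98, 101], [180, 175, 190, 170])
```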

2015 will be the year that the DevOps tool conversation expands beyond its current (almost singular) focus on configuration automation tools like Puppet and Chef to embrace the fact that collaboration across teams and tools is equally critical to DevOps transformation. As enterprise DevOps efforts expand beyond pilot projects with teams located in the same physical office, organizations will find that SharePoints, emails, conference calls, and instant messaging don’t scale and aren’t effective at aligning distributed teams and tools to support the flow that DevOps is intended to enable. Collaboration capabilities will be increasingly added to existing DevOps-oriented software products. Solutions that enable collaboration across development, project management, and IT operations tools and teams will be sought out and adopted by the many organizations who will struggle with the “uber change” and “uber collaboration” imperative that DevOps represents.
Matthew Selheimer
SVP of Marketing, ITinvolve

Dev teams are finding ITOA invaluable to quickly determine whether problems are due to their code or to something else, e.g. the cloud infrastructure. In 2015, ITOA is predicted to become even more correlative, not only correlating all performance and availability data across the IT stack, but relating it to change management data (e.g. from automated code deployment and release management tools) as well. This is going to be critically useful, as the large majority of performance and availability issues are caused by changes. This added correlation will also drive the increased deployment of such ITOA tools into the pre-production/QA stage so they can find potential problems earlier. It is predicted that this “merging” of ITOA tools from pre-production/QA to production uses will become more common.
Phil Tee
Chairman, CEO and Co-Founder, Moogsoft

High profile application performance issues in 2014 – such as – drove a scramble to understand application performance and institute discipline around DevOps. In 2015 these disciplines are going to become an ante – they will be part of every major project’s stage gate for release. This doesn’t mean that we will settle on standards or that all rollouts will take equal advantage of the tools available – but CIOs and business teams will insist on having performance metrics as part of the go/no-go decision matrix. This is the start of moving the basis of the IT conversation away from availability and towards performance – which ultimately will lead to better results for our customers.
Mark Swanholm
Chief Strategy Officer, Performance Tuning Corporation


Banks will upgrade their APM capability in response to an increasing focus on application availability by financial regulators. The provision of online banking has long since moved from a nice-to-have to a service level expectation. In Europe there have been fines in 2014 for banking application down time and in other parts of the world expectations for application availability are being set in regulatory stone, for example, the new MAS TRM guidelines (Monetary Authority of Singapore Technology Risk Management).
Bob Tarzey
Analyst and Director, Quocirca

Big Data: Concurrent CEO Suggests a Pragmatic View for Data Projects in 2015

Dick Weisinger

Where will the Big Data industry be going in 2015? Gary Nakamura, CEO of Concurrent, has published his list of predictions about the direction that Big Data will take. This is the second year in a row that Nakamura has offered up predictions.

Nakamura said that “this year every company is in the business of data, and this will drive the demand for cost effective and scalable Big Data platforms higher than ever before. As the market continues to catch up to the hype, 2015 will be the year that Hadoop becomes a worldwide phenomenon. As part of this, expect to see more Hadoop-related acquisitions, IPOs and the rise of new jobs.”

Nakamura’s predictions made a year ago today for Big Data in 2014 were as follows:

  1. Expect to see more funding of Big Data companies and potentially a significant IPO
  2. More Hadoop projects will fail than will succeed
  3. Big Data Projects will be an increasingly important part of business processes, leading to a need for Big Data project managers
  4. Big Data will become more about the apps that use big data than the data itself
  5. Big Data will be everywhere, but will continue to be convoluted and confusing.

This year, Nakamura again has predictions.  Going into 2015, Nakamura expects to see the following trends in the new year.

  1. Given the number of Big Data failures in 2014, companies will be more pragmatic in matching Big Data to the right problems
  2. MapReduce, because people have become familiar with it, will continue to dominate over other Big Data options, even newer ones like Apache Spark and Tez
  3. Increasingly, Java Enterprise developers will see their skillset in high demand to work on data projects
  4. Hadoop adopters will be looking closely in 2015 to measure their success and return on investment of their projects
  5. “Elephants will fly” — Hadoop will make a push to become a worldwide phenomenon

2015 Predictions Reveal a More Pragmatic Approach to Big Data

Predicted Trends include More Jobs, Failure, Money and Confusion, Driving a More Realistic Approach to Big Data and Worldwide Expansion for Hadoop

SAN FRANCISCO – Dec. 11, 2014 – Concurrent, Inc., the leader in data application infrastructure, today released 2015 predictions for big data and Apache Hadoop. While the past year brought expansion and education across the Hadoop ecosystem and increased investment in related initiatives, 2015 will be the year of pragmatism as global enterprises turn their focus to getting value from these investments.

“This year proved that every company is in the business of data, building the demand for cost-effective and scalable big data platforms,” said Gary Nakamura, CEO of Concurrent, Inc. “As the market continues to catch up to the hype, 2015 will be the year that Hadoop becomes a worldwide phenomenon. As part of this, expect to see more Hadoop-related acquisitions, IPOs and the rise of new jobs.”

Nakamura offered the following 2015 predictions:

  1. A more pragmatic approach to big data will prevail.

    The big data gold rush has been fast and frenzied, leading to more failures and half-baked projects than successes. In 2015, enterprises will start to focus on specific problems and on finding the right solutions. More companies will realize that it is better to invest in proven technologies for specific problems, as there is no one-size-fits-all approach when it comes to data initiatives.

  2. Good old-fashioned MapReduce will dominate production.

    Apache Spark and Apache Tez will be ubiquitous, but traditional MapReduce will dominate production deployments in 2015. Pragmatic enterprises in the business of data will focus on execution, reliability and mitigating risk on the path to success. Enterprises will gravitate toward what is proven and reliable to solve their big data challenges and not necessarily the most popular or state-of-the-art tech on the market.

  3. Next year will see the resurrection and rise of the enterprise Java developer.

    Developer tools have matured greatly in the past few years, yet Java remains firmly planted in enterprise IT shops. Most organizations have a deep enterprise developer bench, and in 2015 we will witness a rise in the importance of the enterprise Java developer. No more superheroes: clarity in roles and the path to repeatable success will become more and more evident. Enterprise Java developers have the experience and the understanding of what it takes to build robust, reliable and maintainable applications. This skillset will become critical for the long-term success and sustainability of an enterprise data strategy.

  4. The big data “thang” will continue to remain convoluted.

    Sorry to report again this year that the ever-growing big data market will continue to confuse, rather than provide a clear-cut path or guidance on what it is exactly that enterprises should be doing. Those businesses that can separate the wheat from the chaff, and execute upon this valuable knowledge, will be the ones to succeed.

  5. 2015 will be a “show me the money” year when it comes to Hadoop.

    Hadoop has been rapidly adopted as “the way” to execute any go-forward data strategy. However, early adopters must now show a return on investment, whether that means migrating workloads from legacy systems or building new data applications. Luckily, products and tools are evolving to keep pace with the trajectory of Hadoop.

  6. Elephants will fly – Hadoop will make a push to a worldwide phenomenon.

    Hundreds of thousands of data points reported from the Cascading ecosystem support the notion that Hadoop is rapidly spreading across Europe and Asia, with other parts of the world soon to follow. Expect strong Hadoop adoption next year as enterprises ramp up their data strategies around Hadoop, creating new jobs and further disrupting the data market worldwide.
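
The “good old-fashioned MapReduce” of prediction 2 boils down to a map, shuffle, reduce pipeline. As a rough illustration, here is word count, MapReduce’s canonical example, collapsed into plain Java rather than the Hadoop API (which distributes these same phases across a cluster):

```java
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

// Illustrative only: the map -> shuffle -> reduce pattern that Hadoop
// MapReduce runs across many machines, expressed in-memory.
public class WordCount {

    // "Map" emits one token per word; "shuffle" groups equal words
    // together (groupingBy); "reduce" sums the counts per group.
    static Map<String, Long> count(List<String> lines) {
        return lines.stream()
                .flatMap(line -> Arrays.stream(line.toLowerCase().split("\\s+")))
                .filter(w -> !w.isEmpty())
                .collect(Collectors.groupingBy(w -> w, Collectors.counting()));
    }

    public static void main(String[] args) {
        // Two "input splits"; a real job would shard these across mappers
        // and merge the reducers' output files.
        System.out.println(count(List.of("the elephant flies", "the yarn spins")));
    }
}
```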

About Concurrent, Inc.

Concurrent, Inc. is the leader in data application infrastructure, delivering products that help enterprises create, deploy, run and manage data applications at scale. The company’s flagship enterprise solution, Driven, was designed to accelerate the development and management of enterprise data applications. Concurrent is the team behind Cascading, the most widely deployed technology for data applications with more than 200,000 user downloads a month. Used by thousands of businesses including eBay, Etsy, The Climate Corp and Twitter, Cascading is the de facto standard in open source application infrastructure technology. Concurrent is headquartered in San Francisco and online at

Stripe Open Sources Tools For Apache Hadoop

December 9, 2014
Alex Giamas

Stripe, the internet payments infrastructure company, recently announced that it is open sourcing a set of internally developed tools built on Apache Hadoop.

Timberlake is a dashboard for Hadoop jobs. Written in Go with a React.js frontend, it improves on existing Hadoop job trackers: by providing waterfall and box-plot visualizations for jobs, it makes it easier to figure out what is slowing a MapReduce job down. Timberlake plays well with Scalding and Cascading and can visualize their flows. It works only with the YARN Resource Manager API and has been tested on v2.4.x and v2.5.x.
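
For context, the Resource Manager API that Timberlake consumes is a plain-HTTP REST surface served from the RM web port (8088 by default). A minimal sketch of building the request URL for listing applications, with a placeholder hostname:

```java
// Sketch of addressing the YARN ResourceManager REST API; the host name
// here is a placeholder, and 8088 is the RM web port's default.
public class YarnApps {

    // Build the URL for the Cluster Applications endpoint, optionally
    // filtered by application state (e.g. "RUNNING").
    static String appsUrl(String rmHost, int port, String states) {
        String base = "http://" + rmHost + ":" + port + "/ws/v1/cluster/apps";
        return states == null ? base : base + "?states=" + states;
    }

    public static void main(String[] args) {
        System.out.println(appsUrl("rm.example.com", 8088, "RUNNING"));
        // On a live cluster, a GET on this URL returns a JSON "apps" array
        // with fields such as id, state and elapsedTime, which is the kind
        // of data a dashboard like Timberlake visualizes.
    }
}
```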

Brushfire is a Scala framework for distributed supervised learning of decision tree ensemble models. Based on Google’s PLANET paper, it is built on top of Hadoop and Scalding. Brushfire runs classification tree learning algorithms scalably on commodity hardware and can build and validate random forests from large training sets.
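
The ensemble idea behind random forests is simple to state even without Brushfire: many weak trees each cast a vote, and the majority wins. A toy sketch of that voting step (this is not Brushfire’s API; the “trees” here are reduced to single-threshold stumps):

```java
import java.util.List;
import java.util.function.Predicate;

// Illustration of decision-tree-ensemble classification by majority vote.
// Each Predicate stands in for one trained tree.
public class EnsembleVote {

    static boolean predict(List<Predicate<double[]>> trees, double[] x) {
        long yes = trees.stream().filter(t -> t.test(x)).count();
        return yes * 2 > trees.size();  // strict majority votes "true"
    }

    public static void main(String[] args) {
        // Three stumps, each thresholding the toy feature vector differently.
        List<Predicate<double[]>> forest = List.of(
                x -> x[0] > 0.5,
                x -> x[1] > 0.3,
                x -> x[0] + x[1] > 1.0);
        System.out.println(predict(forest, new double[]{0.9, 0.4}));  // true
    }
}
```

Frameworks like Brushfire do the hard part this sketch skips: learning the split thresholds from training data, distributed over Hadoop.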

Sequins is a dead-simple static database. It indexes and serves SequenceFiles over HTTP, so it’s perfect for serving data created with Hadoop. It’s a simple way to provide low-latency access to key/value entries generated by Hadoop jobs.
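
The serving pattern Sequins implements, a read-only key/value lookup behind HTTP GETs, can be sketched in a few lines of Java. This illustrates the idea only; Sequins itself serves indexed SequenceFiles from disk rather than an in-memory map:

```java
import com.sun.net.httpserver.HttpServer;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.util.Map;

// Toy read-only key/value store over HTTP: GET /<key> returns the value,
// or 404 if the key is absent.
public class KvServer {

    static HttpServer serve(Map<String, String> data, int port) throws Exception {
        HttpServer server = HttpServer.create(new InetSocketAddress(port), 0);
        server.createContext("/", exchange -> {
            String key = exchange.getRequestURI().getPath().substring(1);
            String value = data.get(key);
            byte[] body = (value == null ? "" : value).getBytes();
            exchange.sendResponseHeaders(value == null ? 404 : 200,
                    value == null ? -1 : body.length);
            try (OutputStream out = exchange.getResponseBody()) {
                if (value != null) out.write(body);
            }
        });
        server.start();
        return server;
    }

    public static void main(String[] args) throws Exception {
        // Port 0 asks the OS for any free port.
        HttpServer s = serve(Map.of("user:42", "alice"), 0);
        System.out.println("listening on port " + s.getAddress().getPort());
        s.stop(0);
    }
}
```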

Finally, Herringbone is a suite of tools for working with Parquet files on HDFS and with Cloudera Impala and Apache Hive. Stripe uses Apache Parquet extensively for efficient columnar storage and pairs it with Cloudera Impala; Herringbone is essentially a set of command-line tools that make that development more productive.

With Apache Hadoop 2.6 just released, and with several big technology companies either contributing to Hadoop development or open sourcing tools from their internal stacks, the future looks bright for Apache Hadoop.