Big Data predictions for 2018: the expert view
2017 was a big year for Big Data, with new technology proliferating and professionals upskilling to meet demand. Data is a resource, and companies are beginning to acknowledge the depth of its power.
And it’s all in good time: it’s estimated that 40 zettabytes (40 trillion gigabytes) of data will be created by 2020 – 300 times the amount of data that existed in 2005.
On a day-to-day level, each of us creates data on our phones and computers: some six billion phones are in circulation, and 3.77 billion people use the internet – and that’s just in our personal lives.
The true power of data is in business – in utilising data for practical applications. We talked to our CEO, Simone, to outline the Big Data predictions to look out for in 2018.
1. Growing interest in the practical applications of Big Data
For many companies, Big Data seems impenetrable because of the complexity of the associated systems and software. The language, too, can be a head-scratcher, with talk of multifaceted infrastructure and reams of data.
In the last year, public debate has shifted from Big Data to ‘IoT’ and ‘AI’ – from the general interest in using data to the actual applications of it.
The applications are vast and vary from industry to industry, spanning machine learning, predictive maintenance, and AI. Forbes predicts that 70 percent of companies expect to implement AI in 2018 – but what does that actually look like?
For Simone, the stat is contentious.
“There are many different understandings of what ‘implementing AI’ actually means,” he points out. “The scenario will be very heterogeneous [highly varied]. A company that dedicates efforts to digitalise its supply chain processes, for instance, probably falls into the 70 percent, just like a consumer brand that implements a ready-to-go chatbot in its customer support.
“They both implement AI, but do they really belong to the 70 percent? Trend analysts will need to be careful about studying these changes in order to give realistic expectations to businesses and investors.”
2017, Simone says, was about preparing – essentially tooling up for analytics – whereas 2018 will see widespread practical application. Finance, the automotive industry, marketing, telecommunications, factories, and even vineyards have already seen a significant impact, and many more sectors are set to follow suit.
In fact, France’s president Emmanuel Macron went so far as to call for a Europe-wide strategy on Big Data during his visit to Beijing in January 2018, as he sees it as a priority for the future.
With presidential backing, the practical applications of Big Data have truly entered the mainstream.
2. The emergence of hybrid solutions and self-service analytics
While many new technologies are cloud-based, the cloud isn’t always an option for data storage.
Many organisations see the benefits of moving to the cloud – namely efficiency and escaping legacy solutions – but many lack the resources to migrate fully, in which case a hybrid solution is the best option.
According to Gartner, a hybrid cloud service is a cloud computing service that is composed of a combination of private, public and community cloud services, from different service providers.
For example, an organisation may choose to store sensitive client data on their premises on a private cloud application, but interconnect that application to a SaaS-based business intelligence application provided on a public cloud.
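To make the pattern concrete, here is a minimal sketch in Python. The local database, table, and API endpoint are invented for illustration; the point is that raw client records stay in the private, on-premises store, and only aggregated figures travel to the public-cloud application.

```python
# Hybrid pattern sketch: sensitive records stay on-premises; only
# aggregates are pushed to a public-cloud BI service.
# The local database, table, and endpoint URL are hypothetical.
import json
import sqlite3
import urllib.request

# 1. Query the private, on-premises store for client data.
conn = sqlite3.connect("onprem_clients.db")
rows = conn.execute(
    "SELECT region, SUM(revenue) FROM orders GROUP BY region"
).fetchall()
conn.close()

# 2. Only the regional aggregates leave the premises.
payload = json.dumps(
    [{"region": region, "revenue": revenue} for region, revenue in rows]
).encode()

# 3. Send them to the public-cloud BI application.
request = urllib.request.Request(
    "https://bi.example.com/api/v1/datasets/sales",  # hypothetical endpoint
    data=payload,
    headers={"Content-Type": "application/json"},
    method="POST",
)
urllib.request.urlopen(request)
```

The sensitive rows never cross the network; only the summary does – which is what makes the hybrid arrangement workable under data-protection constraints.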
Meanwhile, ‘self-service’ data analytics is another trend that’s gathering steam. As Simone says, “many companies are adopting ‘self-service’ data analytics tools as the need for analysis grows, so there’s a desire for employees to learn how to run analyses autonomously, without having to hire a data scientist every time an analysis is needed.”
In marketing, for example, self-service analytics takes the shape of a dashboard that allows marketers to examine, manipulate, and report on the data without the need for a data scientist.
Essentially, it’s making analytics even more accessible by creating intuitive and user-friendly software.
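To picture what that looks like under the hood, here is a minimal sketch in Python of the kind of query a self-service dashboard runs when a marketer slices campaign data. The file name and column names are invented for illustration.

```python
# A minimal sketch of a "self-service" query: the aggregation a
# dashboard runs when a marketer asks for cost per click by channel.
# The CSV file and column names are hypothetical.
import pandas as pd

campaigns = pd.read_csv("campaigns.csv")  # hypothetical campaign export

report = (
    campaigns.groupby("channel")
    .agg(spend=("spend", "sum"), clicks=("clicks", "sum"))
    .assign(cost_per_click=lambda df: df["spend"] / df["clicks"])
    .sort_values("cost_per_click")
)
print(report)
```

The value of a self-service tool is that this logic sits behind buttons and filters, so no one has to write the code at all.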
3. The decline of ETL solutions
ETL (extract, transform, and load) is a process that enables IT departments to extract data from databases, transform it, and load it into a data warehouse or operational data store for analysis.
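As a rough illustration, a classic nightly ETL job might look something like the Python sketch below; the connection strings, tables, and columns are invented for the example.

```python
# A minimal nightly ETL job, assuming a pandas/SQLAlchemy pipeline.
# Connection strings, tables, and columns are hypothetical.
import pandas as pd
from sqlalchemy import create_engine

source = create_engine("postgresql://user:pass@crm-db/prod")
warehouse = create_engine("postgresql://user:pass@warehouse/dwh")

# Extract: pull yesterday's orders from the operational database.
orders = pd.read_sql(
    "SELECT * FROM orders WHERE created_at >= CURRENT_DATE - 1", source
)

# Transform: derive the figures analysts actually query.
orders["total"] = orders["quantity"] * orders["unit_price"]
daily = orders.groupby("customer_id", as_index=False)["total"].sum()

# Load: append the result to the warehouse.
daily.to_sql("daily_customer_totals", warehouse, if_exists="append", index=False)
```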
Additional data sources mean larger workloads – which, naturally, causes problems in the era of Big Data.
The ETL process is a daily grind for IT departments, which is why we expect to see it phased out in the coming years.
Many vendors now offer ‘no-ETL solutions’, which should provide a more streamlined data architecture and flow.
4. Growing interest in cybersecurity-related topics
While the safety of the cloud has been a hot-button topic in cybersecurity in recent years, Big Data is something of a double-edged sword.
Big Data has given cybercriminals the opportunity to access vast quantities of sensitive data. And with GDPR looming on the horizon, a security breach could have bigger implications than ever.
The scale of Big Data also means more work for cybersecurity teams: a medium-sized organisation with 20,000 devices can transmit more than 50TB of data a day. Spread over the 86,400 seconds in a day, that works out to roughly 5 gigabits of data that must be analysed every second to detect potential threats.
It’s an astonishing number, but as much as Big Data is a threat, it also provides opportunities in the space: cybersecurity could benefit greatly from the actionable intelligence and predictive systems that Big Data is known for.
5. Predictive maintenance and Industry 4.0 will become a necessity – but the wider world is waiting too
Predictive maintenance is the spiritual successor to the less-efficient preventative maintenance.
Built on machine learning models that predict when a machine will break down, it has been revolutionary for factories (and is one of the hallmarks of the rise of Industry 4.0, which will be powered by smart factories and analytical and cognitive technologies).
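As a rough sketch of the idea, a predictive maintenance model can be as simple as a classifier trained on labelled sensor history; the file, feature names, and label below are invented for illustration.

```python
# A minimal predictive maintenance sketch: train a classifier on
# historical sensor readings to flag machines likely to fail soon.
# The CSV file, features, and label column are hypothetical.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

history = pd.read_csv("sensor_history.csv")
features = history[["vibration", "temperature", "hours_since_service"]]
labels = history["failed_within_7_days"]  # 1 if the machine soon failed

X_train, X_test, y_train, y_test = train_test_split(
    features, labels, test_size=0.2, random_state=0
)

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)

# Rank machines by failure risk so the riskiest are serviced first.
failure_risk = model.predict_proba(X_test)[:, 1]
print("Held-out accuracy:", model.score(X_test, y_test))
```

In a real deployment the model would be retrained continuously as new sensor data arrives, but the principle is the same: service machines before they fail, not after.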
Predictive maintenance represents a radical shift towards a more efficient factory model, in which downtimes and delays are a thing of the past.
The stats for predictive maintenance speak for themselves: McKinsey reported that it could reduce the overall cost in factories by 30 percent, while reducing downtime by up to 40 percent.
As much as predictive maintenance is a boon for the industrial sector, some of the most exciting developments are in healthcare, where it offers the opportunity to directly improve lives on a massive scale.
“Scientists are working hard on finding new ways to offer effective cures, efficient healthcare systems, personalised medicine and so on – and this is amazing,” Simone says.
“If we were able to take and analyse all the health records on earth, together with climate and environmental conditions, shouldn’t we be able to predict – and fight back against – epidemics before they show up?”
It’s an affecting thought – and one that may well become a reality as the technology evolves and matures.
“Already, current technology enables us to do things that weren’t possible just five years ago,” Simone says. “However, many companies should leave behind the mindset of ‘as good as it gets’ – and start treating Big Data as the precious resource it is.
“It’s no longer a matter of ‘how do I extract Big Data from my processes?’ It’s a matter of ‘what kind of data do I need to grow my business, to create new business – to reach 2020, 2025, 2030?’ If the last few years were about ‘gathering supplies’, then now is the time to understand and utilise them.”
Are you interested in revolutionising your organisation with Big Data?
If you want to harness the power of Big Data in your business, Statwolf’s data science service can help, with advanced online data visualisation and analysis running directly in your web browser.
We offer a range of custom services to suit your needs: advanced data analysis and modelling, custom algorithm creation, and fraud analysis.
Want to make sense of your data? Download our comprehensive guide: The Predictive Maintenance Cookbook.