On any given day, people around the world drink milk, eat butter and cheese or buy a product with a specialist ingredient produced by the New Zealand multinational co-operative Fonterra.
“We reach a billion people each day,” says Nigel Adler, general manager of global design.
Owned by around 10,500 farmers, Fonterra is the world’s largest dairy exporter and its products are sold in more than 140 countries.
“We have a wealth of data, around 900,000 sensors that provide us with information roughly every two milliseconds,” Adler says. “This means every two milliseconds we get a reading from one of the sensors, such as milk flow and temperature, in the manufacturing sites.
“We have data that sits all the way through, from our tankers to product development and containers,” he says.
“Then, at the other extreme, we’ve got social media sentiment analysis.”
Adler, who has held a range of roles from programme manager to network engineer at the dairy giant, is a member of its IS leadership team. He is also the sponsor of the data programme which includes Fonterra’s advanced analytics platform, data quality management, master data management and data governance.
His team has identified significant potential for Fonterra through the application of advanced analytics.
Adler says it asked the question: “How do we bring the talent and capability to get that insight and foresight and help the organisation start making decisions on that data?”
“The flashpoint for us came over a year ago,” he says. “We had a very valuable use case. We knew the data that we needed. We had to find the data, get approval to use it for analysis and create an environment in which to analyse it.”
The data was easily found as Fonterra had a team of functional experts to locate it, and a way to extract it. The second step was getting security approval to release it.
“We followed it through and got the sign off. But it took us three months to create a place where we could perform the advanced analytics,” Adler says.
“We looked at the pipeline of work that was coming and we couldn’t spend three months every time we wanted to perform data science on a specific set of data. There had to be a better way.
“We knew we had to change our approach to data, with a new operating model and a new architecture.”
He says the more the team looked at the problem, the more it realised the key to success was elastic storage, elastic compute and open technology. It was then it began working with AWS to create the analytics labs.
Adler’s team faced two problems in trying to unlock the data, the first of which was how to manage the quality of the data and put it in a technical environment where Fonterra could do something with it.
“We focused heavily on data quality and the application of business rules and definitions,” he says.
The second was that the team needed to make a fundamental change in the way it approached its analytics strategy.
This meant shifting its analytics architecture strategy from an SAP HANA-based architecture to an open cloud-based analytics platform, he says.
“We changed the operating model so we could very quickly provision an environment, and put data into it so our data scientists could do something with it. As a result, we have a ‘gravity-based’ strategy.
“Now that we have this wealth of information coming through, how do we start bringing it all together into one platform to form one set of data rather than trying to have one set here, and another set here?”
Adler says Fonterra still supports platform-based analytics such as SAP or Salesforce, but when it needs to bring data together, it uses the AWS data hub: “Without that, we would not have been able to unlock the data.”
The partnership between Fonterra’s Velocity and Innovation team, led by Judith Swales, and the Global Information Services Team (IS), led by Gerben Otter, is key.
Adler says IS is great at delivering the platform and helping manage the quality of the data, while the Velocity and Innovation team delivers the insight and foresight needed for advanced analytics and helps drive it throughout the company.
“That is one of the reasons why we are so successful,” he says.
Beating the innovation curve
Adler can’t stress enough the role played by data in Fonterra’s goal to become more innovative and sustainable.
He says its new data strategy allowed it to test-and-learn and improve speed without going through the cost of full ingestion and refinement.
“Test-and-learn is massive for us,” he says.
“In the traditional data warehousing approach, we go and get the data, put it through a rigorous extract, transform, load [ETL] process and then place it in an environment and let people conduct an experiment.
“You never really know how much of the data you are putting in is actually delivering value to the organisation. We want to prioritise and put our effort into the industrialised ingestion of data, in other words extracting the data and putting it into the data lake to do the prioritisation.
“You have to know what is important and what isn’t. You have to find a way where you can test.”
He says that in one of Fonterra’s most valuable use cases, it tested more than a thousand features from multiple data sources and discovered only 13 of them were valuable. So instead of industrialising the ingestion of a thousand features, it industrialised just 13.
“That means the use case is now up and running. It is containerised, sitting on a data hub and adding value to Fonterra,” Adler says.
“And now, when the next use case comes along, we prioritise our ingestion. That means we can continually create these models.”
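The test-and-learn step Adler describes, scoring a wide set of candidate features and industrialising only the handful that prove valuable, can be sketched in a few lines. The scoring method (a simple correlation filter), the synthetic data and the cut-off below are illustrative assumptions; the article does not describe which technique Fonterra actually used.

```python
# Hypothetical sketch of test-and-learn feature selection: score every
# candidate feature against the target, then industrialise only the few
# that carry real signal. The method and data here are illustrative.
import random
import statistics

def pearson(xs, ys):
    """Plain Pearson correlation between two equal-length sequences."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy) if sx and sy else 0.0

rng = random.Random(0)
n_rows, n_features = 400, 50  # stand-in for ~1,000 candidate features

# Synthetic data: only features 0-12 actually drive the target.
features = [[rng.gauss(0, 1) for _ in range(n_rows)] for _ in range(n_features)]
target = [sum(features[i][r] for i in range(13)) + rng.gauss(0, 0.5)
          for r in range(n_rows)]

# Rank features by |correlation| and keep the top 13 for industrialised ingestion.
scores = [(abs(pearson(col, target)), i) for i, col in enumerate(features)]
keep = sorted(i for _, i in sorted(scores, reverse=True)[:13])
print(keep)  # with enough rows, the informative features surface
```

In practice the cut-off would come from the experiment itself (which features actually improve the model), not from knowing the count in advance; the point is that only the surviving features are put through the industrialised ETL pipeline.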
“We are using data in exciting ways and experimenting with research and development,” he says. “We use data associated with our products to predict the flavour profiles of the new product.”
A global effort
Elsewhere in the operation, on-farm data is providing better insights on how Fonterra can become more innovative, sustainable and efficient.
Adler says the business is running irrigation trials using moisture pods on the ground to measure soil moisture. With the use of analytics, the data can help with deciding how to use less water, energy and fertiliser.
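The irrigation-trial logic can be illustrated with a minimal sketch, assuming a simple moisture threshold. The pod readings, units and target value below are invented for illustration and are not Fonterra's.

```python
# Hypothetical sketch: deciding whether to irrigate from soil-moisture
# pod readings. Threshold and readings are illustrative assumptions.
def should_irrigate(readings_pct, target_pct=30.0):
    """Irrigate when the average soil-moisture reading falls below target."""
    avg = sum(readings_pct) / len(readings_pct)
    return avg < target_pct

pods = [28.5, 26.0, 31.2, 24.8]  # moisture (% volumetric water content)
print(should_irrigate(pods))  # True: average 27.6% is below the 30% target
```

A real deployment would feed these readings into a model alongside weather forecasts and fertiliser schedules, but the decision boundary idea is the same: irrigate only when the data says the soil needs it.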
“We embraced the mindset that data quality and analytics have to be a core business function,” he says.
To this end, Fonterra has a team whose sole focus is data quality. It has also created an analytics community of excellence led by its general manager of advanced analytics, David Bloch.
“It is like a guild. There are eight data scientists in Velocity and Innovation, working with data scientists across the co-operative, helping to find opportunities to solve problems, train people on the job, build expertise and share what they’ve learnt,” Adler says.
“We have about 300 or 400 objects that are very important to us and we are making sure we have a clear definition of what they are, and have business rules associated with them,” says Adler, of this centre, which sits under his function.
“Having somebody such as a data steward monitoring the business rules, knowing when an exception is raised, is valuable,” he says.
“Where we are getting a lot of data, we can start using machine learning.”
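The data-steward pattern Adler describes, business rules attached to important data objects and exceptions raised when a rule fails, might look like this in outline. The rules, field names and thresholds are hypothetical; Fonterra's actual rule definitions are not given in the article.

```python
# Hypothetical sketch of rule-based data quality monitoring: each governed
# object carries business rules, and failures are surfaced as exceptions
# for a data steward to review. Rules and fields here are illustrative.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    name: str
    check: Callable[[dict], bool]  # returns True when the record passes

rules = [
    Rule("milk_temp_in_range", lambda r: 0.0 <= r["temp_c"] <= 6.0),
    Rule("flow_non_negative", lambda r: r["flow_l_per_min"] >= 0),
    Rule("site_code_present", lambda r: bool(r.get("site"))),
]

def audit(records):
    """Return (record_index, rule_name) pairs for every failed rule."""
    return [(i, rule.name)
            for i, rec in enumerate(records)
            for rule in rules
            if not rule.check(rec)]

readings = [
    {"site": "TKA", "temp_c": 4.1, "flow_l_per_min": 310.0},
    {"site": "", "temp_c": 9.5, "flow_l_per_min": -3.0},  # three violations
]
exceptions = audit(readings)
print(exceptions)
```

Once exception volumes grow, the same labelled pass/fail history is exactly the kind of data that can train a machine learning model to flag anomalies the hand-written rules miss, which is the shift Adler alludes to.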
Retrospective and future focus
When asked for insights he can share with industry colleagues, Adler lists four major components:
- Actively managing data quality.
- Having the technology needed to do advanced analytics through the data hub.
- Creating an advanced analytics-led community of expertise.
- Treating data management as a continual journey, one every organisation is on.
So how does Fonterra keep pace with the rapid changes in technology that are so much part of today’s world?
“What we are doing at the moment is right on the edge so it puts you in a position where you make a decision and within three or four months there is a better way of doing it,” he says.
Adler tells his team, “Look at the distance from where we began to where we currently are, and compare that to the distance to the next best version.
“I would much rather we complete what we decided to do and, once it is done, then look at continuous improvement.”
That distance, he says, is the key element in perceiving change.
“We have come a long way, and the next best version is just a small step.
“It is forcing us to rethink a lot of things. For instance, the patterns we have today are built from a datacentre perspective. They are still valid, but should we be applying them when we are in an API-driven, infrastructure-as-code style environment?”
“An incredible future awaits us,” Adler concludes.