Jérémy Dahan

Jérémy leads new technology research for Elektrobit, including managing business development and partnerships in California’s Silicon Valley. Working out of EB’s Silicon Valley Innovation Lab, he scouts new technologies and business opportunities and develops proof-of-concept technologies. Jérémy was formerly a software architect and team lead for EB’s work in navigation data standard technology. He also led the human-machine interface (HMI) team at Altran in France, prototyping a new infotainment HMI for a French car maker. Jérémy holds master’s degrees in engineering and embedded systems from Mines Saint-Etienne, France and CPE Lyon, France.
Data storage costs are so low that you might as well keep it all. True or false?
When collecting terabytes or even petabytes of test-drive data, a system gets clogged very quickly: a huge amount of data needs to be extracted and stored, but capacity is limited. Pushing all of that data to a hybrid storage solution sounds like a good idea, but it leads to dramatic increases in management time and storage costs, and ultimately only a very small portion of the data will ever be relevant and used. That leaves us with this question: how do you determine which data should be kept and indexed so that you can find it quickly? A system leveraging tiered storage with pre-analysis of the data at the earliest stages helps solve this challenge. In addition, leveraging a digital twin with tweaked variances helps reduce the need to collect real-world driving data. Where can we find this solution?
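To make the idea of tiered storage with early pre-analysis concrete, here is a minimal sketch of what such a tiering policy might look like. Everything here is a hypothetical illustration, not Elektrobit's actual system: the `Segment` type, the event labels, and the three-tier split are assumptions standing in for whatever an upstream pre-analysis pipeline (object detection, scenario tagging) would actually produce.

```python
# Hypothetical sketch: route test-drive data to storage tiers based on
# pre-analysis results, so only likely-relevant data lands in fast,
# indexed (and expensive) storage. All names and thresholds are invented.
from dataclasses import dataclass, field

@dataclass
class Segment:
    name: str
    size_gb: float
    events: list = field(default_factory=list)  # labels from pre-analysis

# Scenarios presumed rare and valuable for development or validation.
RARE_EVENTS = {"cut-in", "pedestrian-crossing", "emergency-brake"}

def assign_tier(segment: Segment) -> str:
    """Decide a storage tier at ingest time, before data is archived."""
    if RARE_EVENTS & set(segment.events):
        return "hot"    # fast, indexed storage: likely to be queried often
    if segment.events:
        return "warm"   # cheaper storage, still indexed for retrieval
    return "cold"       # archive or discard candidate: routine driving

segments = [
    Segment("drive_001", 120.0, ["lane-keep"]),
    Segment("drive_002", 80.0, ["cut-in", "lane-keep"]),
    Segment("drive_003", 95.0),
]

tiers = {s.name: assign_tier(s) for s in segments}
print(tiers)  # {'drive_001': 'warm', 'drive_002': 'hot', 'drive_003': 'cold'}
```

The point of running this classification at the earliest stage is that the cold tier, which in practice holds the vast majority of the data, never has to be fully indexed or moved to premium storage at all.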