Autonomous vehicles not only require massive amounts of data to effectively “see” the world, but they also generate terabytes of data per hour of use!
This means that, for the algorithms to continually learn and improve, there must be an economical and efficient way to move that massive amount of data.
While there are companies that are beginning to specialize in this niche, a significant investment in localized data movement must be made for autonomous vehicles to become ubiquitous across the world!
To keep up with the podcast be sure to visit our website at datacouture.org, follow us on twitter @datacouturepod, and on instagram @datacouturepodcast. And, if you’d like to help support future episodes, then consider becoming a patron at patreon.com/datacouture!
Music for the show: Foolish Game / God Don’t Work On Commission by spinmeister (c) copyright 2014 Licensed under a Creative Commons Attribution (3.0) license. http://dig.ccmixter.org/files/spinmeister/46822 Ft: Snowflake
Welcome to Data Couture, the podcast about data culture at work, at home, and on the go. I’m your host, Jordan Bohall. To stay up to date with everything Data Couture, be sure to like and subscribe down below. Furthermore, be sure to follow us around the internet: at datacouturepod on Twitter, at datacouturepodcast on Instagram, and at datacouturepod on Facebook. Of course, if you’d like to help keep the show going, then consider becoming a patron at patreon.com/datacouture. Now, on to the show!
Welcome to Data Couture. I’m your host, Jordan. On today’s Data Bytes, we’re going to follow up on something I said on Monday, namely the need for lots and lots of internet-of-things-connected devices across all of our roadways around the world in order for autonomous vehicles to actually be able to operate autonomously. Before we do that, though, I want to pitch, or I guess plug, something I’m doing this coming week: I will be in Austin for the Driven conference sponsored by BP3 Global. There, I’ll be giving a talk on digital transformation on one of the panels, I believe on Wednesday. So if you’re in the Austin or Texas area and you’re interested in going, I encourage you to show up to the conference and hit me up on social media, and I can get you a wonderful discount to attend.
Now, for this Data Bytes, we’re going to be talking about data storage and the type of data autonomous vehicles actually produce. Not only do they need something like the blockchain to deliver the various maps and other kinds of roadway information they need to operate effectively, but that’s before we even get to sensors like LIDAR, radar, actual cameras, or ultrasonic options, all of which will still be necessary on these cars. And what’s interesting, as you could probably predict, is that each one of these cars is producing massive amounts of data, just massive amounts of constantly streaming data. Companies like Waymo, Google, and Uber, and some of the others getting into this autonomous phase of car manufacturing, have been reporting exactly how much data is produced. Can you guess how much data per car these autonomous vehicles are producing? I’ll wait a second, because I think it’s going to blow your mind, honestly. All right, you ready? The range is between 1.4 terabytes and 19 terabytes per hour.
That means that for a normal amount of daily drive time, say 20 to 30 minutes of commuting per day plus a trip to the grocery store, there’s a very solid chance that any given autonomous vehicle will be producing between 11 terabytes and 152 terabytes per day, per car. And to give you a more fine-grained example of what that means per type of autonomous vehicle vision system: for radar, we’re talking about 15 megabits per second; for LIDAR, we’re talking upwards of 100 megabits per second; for traditional cameras, it’s between 500 and 3,500 megabits per second; and for the various ultrasonic sensors, it’s negligible, at 0.01 megabits per second or even less.
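As a quick sanity check on those numbers, here’s a short Python sketch. The 1.4–19 terabytes-per-hour range is the one quoted in the episode; the eight-hours-per-day operating assumption is mine, chosen because it reproduces the 11–152 terabytes-per-day figures.

```python
# Sanity-check the episode's figures: 1.4-19 TB generated per hour of
# driving implies roughly 11-152 TB per day, per car, if the vehicle
# operates about eight hours a day. The hourly range is from the
# episode; the duty cycle is an assumption.
TB_PER_HOUR = (1.4, 19.0)   # range quoted in the episode
HOURS_PER_DAY = 8           # assumed hours of operation per day

for rate in TB_PER_HOUR:
    daily = rate * HOURS_PER_DAY
    print(f"{rate} TB/hour -> {daily:.0f} TB/day")
```

Note that a 20-to-30-minute drive at those rates would generate far less, under a terabyte at the low end, so the daily totals only reach triple digits for vehicles in near-constant use.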
The point is, the bandwidth these cars are going to consume is between 3 gigabits per second and about 40 gigabits per second, constantly, every single time you drive your car. That means there are going to have to be massive storage devices on these vehicles. Because not only do we want to make sure our algorithms are constantly updated and constantly learning, so that the AI, the deep learning, actually proceeds actively and becomes more and more sophisticated over time, but there’s a serious question about how all of this is going to get transmitted. Is it going to go through your home Wi-Fi? Is it going to be sent back to the car manufacturer, one presumes whenever you get home, taking up your entire bandwidth? Is it going to connect via 5G to some sort of cellular network? And if so, are you going to be responsible for paying the bill for how much data gets transmitted each day?
I don’t know. I am very impressed by the levels of data coming out of these vehicles, and I’m very excited to see what they do with the information beyond simply training their algorithms, because, you know, they’re doing more than just training algorithms with this data. So if you have thoughts about this, or if you have suggestions for how to work around this problem of data transmission, please, let’s talk about it in the comments. Until next time, have a good weekend, and I’ll see you in Austin.
That’s it for the show. Thank you for listening. And if you liked what you’ve heard, then consider leaving a comment or a like down below. To stay up to date on everything Data Couture, be sure to follow us on Twitter at datacouturepod, and consider becoming a patron at patreon.com/datacouture. Music for the podcast is called Foolish Game / God Don’t Work On Commission by the artist spinmeister, used under the Creative Commons Attribution 3.0 license. Writing, editing, and production of the podcast is by your host, Jordan Bohall.