Self-driving vehicles.

The Rocketry Forum

It's not just about wallabies and kangaroos!
I looked at the three articles that followed in the post; the IEEE one addressed that, after commenting on some kangaroos being loose here in the US:

Secondly, kangaroos are, everywhere but in Australia, a good example of autonomous cars' long tail problem (heh), which suggests that as the number of possible driving situations you may encounter approaches infinity, the probability of encountering those situations approaches zero. In other words, lots of weird stuff could potentially happen, and there's no way to predict all of it. If you're driving around in the United States, encountering a kangaroo would be very weird, but it's by no means impossible: here's one that was running around eastern Oklahoma in December of 2013:

I’m sure kangaroos are farther down the list, but it’s a list that's infinitely long, and there’s no clear point at which a potential situation goes from “worth preparing for” to "not worth the effort.”

“Don't hit things” is useful, practical advice that tends to be pretty high up on an autonomous car's generalized priority list. However, there are many different ways of not hitting a thing depending on what that thing is, and sometimes that advice can even cause problems if the car senses something that it doesn't understand.
 
Kangaroos are actually a pretty interesting problem. While other animals travel on a 1D line, mostly in 1 direction, kangaroos apparently take up the whole 3D space with a 2D random walk 😆.
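The long-tail argument above lends itself to a quick numerical sketch. This is a toy model, not data: assume scenario frequencies follow a Zipf-like 1/i distribution (an assumption made purely for illustration). Even after "preparing for" the ten thousand most common scenario types, a substantial chunk of real encounters still falls in the unprepared tail.

```python
# Toy long-tail model: driving scenario i occurs with probability
# proportional to 1/i (a Zipf-like heavy-tailed distribution).
N = 100_000                                   # distinct scenario types
weights = [1.0 / i for i in range(1, N + 1)]
total = sum(weights)
probs = [w / total for w in weights]

# Probability mass covered if we only "prepare for" the k most
# common scenarios -- the tail shrinks, but slowly.
for k in (10, 100, 1_000, 10_000):
    print(f"top {k:>6} scenarios cover {sum(probs[:k]):.1%} of encounters")
```

The coverage numbers grow only logarithmically with k, which is one way to see why the last few percent of driving situations are so expensive to handle.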

 
So, if you're riding in a self-driving vehicle, how will you spend your time while presumably the software is driving?
 
So, if you're riding in a self-driving vehicle, how will you spend your time while presumably the software is driving?
Speaking for myself: more time risking motion sickness. If I'm not actively driving and am instead doing other things as a passenger, motion sickness quickly becomes a rather acute problem.
 
I've been in & out of this conversation.. but I do have a thought..

I feel we are kind of expecting it to be all or nothing; that our next car should be fully automated to drive from our driveway to the parking spot in front of the restaurant, with no intervention from us (except to open the door to get in!) I feel we are missing some of the incremental steps to get there..

The main thought I have is to automate just the highway driving portion first. Let's get that under control & get us used to that aspect, then move on to city driving.

I feel highway driving has some advantages, as we can drive onto the highway, then 'latch onto' a car column already en route. kinda like cruise control. I believe California tried something like this, with magnets in the roadway for guidance. One automated car becomes the leader, and others slot in behind them.. you've programmed where you want to go, so when your exit appears, the car will disengage [with your acknowledgment!] and you then assume control for the remainder of your drive.

Start with the major highways, then move to the smaller routes.

Now, this would only be between major city centers, not through cities where traffic jams are common. But this could help 'ease us' into being comfortable in fully automated cars.

Much more can be discussed with this option / concept..
 
So, if you're riding in a self-driving vehicle, how will you spend your time while presumably the software is driving?
Me? I'll be watching videos about self-driving vehicles of course. 😁
(And to make the most of it, I'll be facing backwards)

I feel we are kind of expecting it to be all or nothing; that our next car should be fully automated to drive from our driveway to the parking spot in front of the restaurant, with no intervention from us (except to open the door to get in!) I feel we are missing some of the incremental steps to get there..
I agree. Incremental steps, and also incremental users. Any one of those steps will be offered in luxury cars first, for a premium. I think it will always be a consumer choice, but many or most people will end up choosing it (as costs become lower).

The main thought I have is to automate just the highway driving portion first...
Supervised FSD already goes anywhere, but as far as I know (not that I would know), driverless robotaxis are presently only approved in certain areas of San Francisco.

https://www.getcruise.com
 
Look at the Ford Maverick -- it's based on a car chassis, but it's a 4-door pick up truck. Starts new at $20k. Dunno if they make a rubber interior, but it looks like Ford is going in the direction you're looking for -- basic, simple, cheap.
I sent Ford a long letter last year on the new Ford Ranger, discussing what I basically said above, but pointing out what the new Ranger lacks for the typical everyday working man or farmer, and how all that un-needed junk is just something to go wrong that will cost a fortune to fix. I also told them how I bought a new Ranger in '98 and still drive it, with 375,000 miles on it at that time (now 400,000+ miles), and never did anything major to it. And you can't beat a manual transmission for longevity!
So...maybe they listened? I don't know how far they will take it, but the Maverick does look like something I would be interested in. Not that I can afford a new truck, but I've considered a lease due to the low miles I drive now. It would still have to be a manual tranny, though. Me and automatics just don't get along.
Thanks for pointing it out!
 
Speaking for myself: more time risking motion sickness. If I'm not actively driving and am instead doing other things as a passenger, motion sickness quickly becomes a rather acute problem.

This happens to me. I find it helps to open the window a bit, and stick my hand out in it. I don't know why it helps, but it does.

I read about this, and one suggestion I found was to look straight ahead and NOT look around to the sides/rear. Seems to help a bit.
 
To get a feel for the state of the art, I want to closely follow Tesla's release of FSD beta version 10.69, which is supposed to have important updates. A customer who'll be testing it:



Avoiding an opening car door:

 
Here's what I was waiting for: a list of updates in Tesla's most recent upgrade to its Full Self-Driving software. It gives an idea of just how difficult the problem of vehicle automation is and how much work is involved:

FSD Beta v10.69 Release Notes

- Added a new "deep lane guidance" module to the Vector Lanes neural network which fuses features extracted from the video streams with coarse map data, i.e. lane counts and lane connectivities. This architecture achieves a 44% lower error rate on lane topology compared to the previous model, enabling smoother control before lanes and their connectivities become visually apparent. This provides a way to make every Autopilot drive as good as someone driving their own commute, yet in a sufficiently general way that adapts for road changes.

- Improved overall driving smoothness, without sacrificing latency, through better modeling of system and actuation latency in trajectory planning. Trajectory planner now independently accounts for latency from steering commands to actual steering actuation, as well as acceleration and brake commands to actuation. This results in a trajectory that is a more accurate model of how the vehicle would drive. This allows better downstream controller tracking and smoothness while also allowing a more accurate response during harsh maneuvers.

- Improved unprotected left turns with more appropriate speed profile when approaching and exiting median crossover regions, in the presence of high speed cross traffic ("Chuck Cook style" unprotected left turns). This was done by allowing optimisable initial jerk, to mimic the harsh pedal press by a human, when required to go in front of high speed objects. Also improved lateral profile approaching such safety regions to allow for better pose that aligns well for exiting the region. Finally, improved interaction with objects that are entering or waiting inside the median crossover region with better modeling of their future intent.

- Added control for arbitrary low-speed moving volumes from Occupancy Network. This also enables finer control for more precise object shapes that cannot be easily represented by a cuboid primitive. This required predicting velocity at every 3D voxel. We may now control for slow-moving UFOs.

- Upgraded Occupancy Network to use video instead of images from a single time step. This temporal context allows the network to be robust to temporary occlusions and enables prediction of occupancy flow. Also, improved ground truth with semantics-driven outlier rejection, hard example mining, and increasing the dataset size by 2.4x.

- Upgraded to a new two-stage architecture to produce object kinematics (e.g. velocity, acceleration, yaw rate) where network compute is allocated O(objects) instead of O(space). This improved velocity estimates for far away crossing vehicles by 20%, while using one tenth of the compute.

- Increased smoothness for protected right turns by improving the association of traffic lights with slip lanes vs yield signs with slip lanes. This reduces false slowdowns when there are no relevant objects present and also improves yielding position when they are present.

- Reduced false slowdowns near crosswalks. This was done with improved understanding of pedestrian and bicyclist intent based on their motion.

- Improved geometry error of ego-relevant lanes by 34% and crossing lanes by 21% with a full Vector Lanes neural network update. Information bottlenecks in the network architecture were eliminated by increasing the size of the per-camera feature extractors, video modules, internals of the autoregressive decoder, and by adding a hard attention mechanism which greatly improved the fine position of lanes.

- Made speed profile more comfortable when creeping for visibility, to allow for smoother stops when protecting for potentially occluded objects.

- Improved recall of animals by 34% by doubling the size of the auto-labeled training set.

- Enabled creeping for visibility at any intersection where objects might cross ego's path, regardless of presence of traffic controls.

- Improved accuracy of stopping position in critical scenarios with crossing objects, by allowing dynamic resolution in trajectory optimization to focus more on areas where finer control is essential.

- Increased recall of forking lanes by 36% by having topological tokens participate in the attention operations of the autoregressive decoder and by increasing the loss applied to fork tokens during training.

- Improved velocity error for pedestrians and bicyclists by 17%, especially when ego is making a turn, by improving the onboard trajectory estimation used as input to the neural network.

- Improved recall of object detection, eliminating 26% of missing detections for far away crossing vehicles by tuning the loss function used during training and improving label quality.

- Improved object future path prediction in scenarios with high yaw rate by incorporating yaw rate and lateral motion into the likelihood estimation. This helps with objects turning into or away from ego's lane, especially in intersections or cut-in scenarios.

- Improved speed when entering highway by better handling of upcoming map speed changes, which increases the confidence of merging onto the highway.

- Reduced latency when starting from a stop by accounting for lead vehicle jerk.

- Enabled faster identification of red light runners by evaluating their current kinematic state against their expected braking profile.
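That last item is essentially a kinematics check: if stopping before the line would take harder braking than a typical driver applies, the vehicle probably isn't going to stop. Here's a minimal sketch of the idea; the function, its name, and the 4 m/s² comfort threshold are my assumptions, not Tesla's actual implementation.

```python
def likely_red_light_runner(dist_to_line_m: float, speed_mps: float,
                            max_comfortable_decel: float = 4.0) -> bool:
    """Flag a cross-traffic vehicle as a likely red-light runner when
    stopping before the line would require braking harder than a typical
    driver would (threshold is an illustrative assumption)."""
    if speed_mps <= 0.0:
        return False                    # already stopped
    if dist_to_line_m <= 0.0:
        return True                     # moving and already past the line
    # From v^2 = 2*a*d: deceleration needed to stop exactly at the line.
    required_decel = speed_mps ** 2 / (2.0 * dist_to_line_m)
    return required_decel > max_comfortable_decel

print(likely_red_light_runner(40.0, 15.0))   # False: only ~2.8 m/s^2 needed
print(likely_red_light_runner(10.0, 15.0))   # True: would need 11.25 m/s^2
```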

Press the "Video Record" button on the top bar UI to share your feedback. When pressed, your vehicle's external cameras will share a short VIN-associated Autopilot Snapshot with the Tesla engineering team to help make improvements to FSD. You will not be able to view the clip.

Full Self-Driving (Beta) Suspension

For maximum safety and accountability, use of Full Self-Driving (Beta) will be suspended if improper usage is detected. Improper usage is when you, or another driver of your vehicle, receive five 'Forced Autopilot Disengagements'. A disengagement is when the Autopilot system disengages for the remainder of a trip after the driver receives several audio and visual warnings for inattentiveness. Driver-initiated disengagements do not count as improper usage and are expected from the driver. Keep your hands on the wheel and remain attentive at all times. Use of any hand-held devices while using Autopilot is not allowed.
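The suspension rule above reduces to a simple counter. A toy sketch (the five-strike threshold comes from the quoted notes; the class and names are illustrative, not Tesla's):

```python
class FsdSuspensionTracker:
    """Toy model of the suspension rule in the quoted release notes."""
    STRIKE_LIMIT = 5   # forced disengagements before FSD Beta is suspended

    def __init__(self):
        self.forced_disengagements = 0

    def record_disengagement(self, driver_initiated: bool) -> None:
        # Only system-forced disengagements (issued after repeated
        # inattentiveness warnings) count; driver-initiated ones don't.
        if not driver_initiated:
            self.forced_disengagements += 1

    @property
    def suspended(self) -> bool:
        return self.forced_disengagements >= self.STRIKE_LIMIT

tracker = FsdSuspensionTracker()
for _ in range(4):
    tracker.record_disengagement(driver_initiated=False)
tracker.record_disengagement(driver_initiated=True)    # doesn't count
print(tracker.suspended)                               # False: four strikes
tracker.record_disengagement(driver_initiated=False)   # fifth forced strike
print(tracker.suspended)                               # True: suspended
```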

Driver Profiles

Media Player accounts (e.g. Spotify login) are now linked to your driver profile. Simply log into your media account while your driver profile is selected.

Regenerative Braking

Your vehicle can now automatically apply regular brakes for consistent deceleration when regenerative braking is limited due to battery temperature or state of charge. To enable, tap Controls > Pedals & Steering > Apply Brakes When Regenerative Braking Is Limited.

Note: Tesla appears to be rolling out this feature slowly to select vehicles.

Automatic Supercharger Rerouting

If you're navigating to a Supercharger and it suddenly becomes more congested before you arrive, Tesla will now calculate whether there are any nearby Superchargers that may be less congested.

If Tesla believes that it can reduce your total travel time by navigating to a less congested charger, it will reroute you to a Supercharger that's less busy.
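A minimal sketch of that rerouting decision, assuming a simple additive time model (drive time plus expected wait); the function and field names are made up for illustration:

```python
def best_supercharger(chargers, current_target):
    """Pick the charger minimizing estimated total time. The additive
    drive+wait model and field names are illustrative assumptions."""
    def total_time(c):
        return c["drive_min"] + c["expected_wait_min"]
    best = min(chargers, key=total_time)
    # Only reroute when an alternative actually beats the current target.
    return best if total_time(best) < total_time(current_target) else current_target

target = {"name": "A", "drive_min": 20, "expected_wait_min": 25}   # 45 min total
options = [
    target,
    {"name": "B", "drive_min": 28, "expected_wait_min": 5},        # 33 min total
    {"name": "C", "drive_min": 35, "expected_wait_min": 15},       # 50 min total
]
print(best_supercharger(options, target)["name"])   # reroutes to B
```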

Navigation Energy Prediction

Energy prediction for your route has been improved by incorporating forecasted crosswind, headwind, humidity and ambient temperature when using online navigation.
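A rough physics sketch of why those inputs matter: aerodynamic drag grows with the square of airspeed (ground speed plus headwind), while rolling resistance depends only on ground distance, and air density varies with temperature and humidity. All parameter values here are illustrative, not Tesla's:

```python
def route_energy_kwh(distance_km, speed_kmh, headwind_kmh=0.0,
                     cda_m2=0.55, crr=0.009, mass_kg=1850.0, rho=1.20):
    """Rough energy estimate for a constant-speed route segment.
    cda_m2, crr, mass_kg, and rho are illustrative assumptions; rho
    (air density) is where temperature and humidity would enter."""
    v_air = (speed_kmh + headwind_kmh) / 3.6        # m/s through the air
    f_aero = 0.5 * rho * cda_m2 * v_air ** 2        # aerodynamic drag (N)
    f_roll = crr * mass_kg * 9.81                   # rolling resistance (N)
    joules = (f_aero + f_roll) * distance_km * 1000.0
    return joules / 3.6e6                           # J -> kWh

calm = route_energy_kwh(100, 110, 0)
windy = route_energy_kwh(100, 110, 20)
print(f"{calm:.1f} kWh calm, {windy:.1f} kWh into a 20 km/h headwind")
```

Even a modest 20 km/h headwind adds a few kWh over 100 km, which is why ignoring wind makes range predictions noticeably optimistic.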

Regeneration / Acceleration Line

The line directly above the speedometer reading in a Model 3 and Model Y shows the amount of regenerative braking (green) or acceleration (black) that is occurring. The center of the line is neutral, where there is no acceleration or regenerative braking occurring.

The further the line grows to the left, the greater the amount of regenerative braking is taking place, and the more it goes to the right, the greater the acceleration.

With this update the regeneration line will now also show when the physical brakes are being applied. The amount of physical braking will appear as a gray line after the green regen line.

The physical brake line is only shown when the vehicle is in Autopilot.

The regen/acceleration line is now also thicker, making it easier to see.

Heat Pump & Low Voltage Battery

You can now view additional information about your car by tapping Controls > Software > Additional vehicle information.

The list of information will now include the type of low-voltage battery installed and whether your vehicle has a heat-pump.

Powered Trunk

If your vehicle is equipped with a powered trunk, this update addresses an issue that could have prevented your trunk from closing completely.

Improved Vehicle Path

According to new images this beta appears to include an improved vehicle path visualization that shows the vehicle's intention much further out than in previous releases.

https://www.notateslaapp.com/software-updates/version/2022.16.3.10/release-notes
 
Here's a limitation, called "Chuck's Unprotected Left Turn", showing the kind of problem software engineers want to solve:


By comparing the video above with the following one, we have an example of one self-driving feature that's been largely improved in the 10.69 release. ("Unprotected" means no stop sign or traffic lights.)

 
Regular driving in a regular town.

 
A short 3-min one:



In a 2020 tweet:


I think this is because a car that doesn’t have to park can be a source of income the rest of the day.
 
It's a rendering, but it's so beautiful!



And here's a really good narrator for FSD:

 
Not sure how this really relates to this thread.

Last weekend my wife and I drove our 1979 MGB and 2021 Volvo XC60 on a 3.5 hour each way trip, mostly highway. She drove the MGB back and I followed in the Volvo.
The Volvo has autopilot which I used. We briefed that she would drive at 65MPH.

I set the cruise speed at 70MPH and turned on the autopilot behind her.

I have to say I was impressed with the performance of the system. The Volvo varied the gap based on speed, matched her every speed change perfectly, handled the other cars very smoothly and seemed to anticipate the movement of the cars around us, even handling several cars that cut in between us.

It also handled road construction and lane constrictions with full side barriers very well. It brought the car to a complete stop several times and accelerated again.

The only thing I had to do was keep a hand on the steering wheel, it uses a weight sensor to make sure a hand stays on the wheel. If I went 30 sec without a hand on the wheel it would chime at me and eventually disconnect.

Personally, I would rather see the car use eye-tracking cameras instead of a weight-on-the-steering-wheel sensor. And a disconnect at speed is somewhat unsafe.

But overall I was very impressed with the system.
 
Not sure how this really relates to this thread.

Last weekend my wife and I drove our 1979 MGB and 2021 Volvo XC60 on a 3.5 hour each way trip, mostly highway. She drove the MGB back and I followed in the Volvo.
The Volvo has autopilot which I used. We briefed that she would drive at 65MPH.

I set the cruise speed at 70MPH and turned on the autopilot behind her.

I have to say I was impressed with the performance of the system. The Volvo varied the gap based on speed, matched her every speed change perfectly, handled the other cars very smoothly and seemed to anticipate the movement of the cars around us, even handling several cars that cut in between us.

It also handled road construction and lane constrictions with full side barriers very well. It brought the car to a complete stop several times and accelerated again.

The only thing I had to do was keep a hand on the steering wheel, it uses a weight sensor to make sure a hand stays on the wheel. If I went 30 sec without a hand on the wheel it would chime at me and eventually disconnect.

Personally, I would rather see the car use eye-tracking cameras instead of a weight-on-the-steering-wheel sensor. And a disconnect at speed is somewhat unsafe.

But overall I was very impressed with the system.
Thanks. It would be difficult to personally experience each autopilot system available today, so I find first-hand accounts interesting. Maybe I'll look for videos about Volvo. I wasn't thinking of them yet.
 
The Volvo system is much more conservative than the Tesla system.

Volvo undersells the system; I feel Tesla oversells theirs. Having said that, Tesla's has much more capability built in. The Volvo will not change lanes by itself, nor is it designed for non-highway use, so it does not handle crossing traffic.
But as a highway safety device it works excellently.
 
I've been in & out of this conversation.. but I do have a thought..

I feel we are kind of expecting it to be all or nothing; that our next car should be fully automated to drive from our driveway to the parking spot in front of the restaurant, with no intervention from us (except to open the door to get in!) I feel we are missing some of the incremental steps to get there..

The main thought I have is to automate just the highway driving portion first. Let's get that under control & get us used to that aspect, then move on to city driving.

I feel highway driving has some advantages, as we can drive onto the highway, then 'latch onto' a car column already en route. kinda like cruise control. I believe California tried something like this, with magnets in the roadway for guidance. One automated car becomes the leader, and others slot in behind them.. you've programmed where you want to go, so when your exit appears, the car will disengage [with your acknowledgment!] and you then assume control for the remainder of your drive.

Start with the major highways, then move to the smaller routes.

Now, this would only be between major city centers, not through cities where traffic jams are common. But this could help 'ease us' into being comfortable in fully automated cars.

Much more can be discussed with this option / concept..
This full automation of travel is discussed in this video:

The video describes how a Resource Based Economy could work: it would eliminate money and having to work, totally eliminate the use of fossil fuels, and bring zero traffic accidents, advanced road planning, high-speed trains, etc. Fascinating video to watch.
 
This full automation of travel is discussed in this video:

The video describes how a Resource Based Economy could work: it would eliminate money and having to work, totally eliminate the use of fossil fuels, and bring zero traffic accidents, advanced road planning, high-speed trains, etc. Fascinating video to watch.
At over 90 min, I doubt I'll get to watch, but I suspect there are some good ideas in there. There's a whole spectrum between what's realistic and what's science fiction, and we never quite know how far into the unfamiliar we'll reach. Who doesn't like new stuff?

🚕🤖🚕🤖🚕🤖🚕🤖🚕🤖

Tesla AI day in 10 days.
 
At over 90 min, I doubt I'll get to watch, but I suspect there are some good ideas in there. There's a whole spectrum between what's realistic and what's science fiction, and we never quite know how far into the unfamiliar we'll reach.
These 90 minutes of your life will give you a whole new perspective on how humans could act and conduct themselves compassionately among other humans. Zero pollution, zero automobile deaths. It shows what a decently run planet could look like.
 
At over 90 min, I doubt I'll get to watch
I watch a lot of content on YouTube, some for entertainment but most for learning. I watch almost everything at twice the normal speed, so theoretically I could watch that video in 45 minutes. Only rarely will I find someone on YouTube whom I have to slow down to 1.75x or 1.5x in order to easily understand.
 
I watch a lot of content on YouTube, some for entertainment but most for learning. I watch almost everything at twice the normal speed, so theoretically I could watch that video in 45 minutes. Only rarely will I find someone on YouTube whom I have to slow down to 1.75x or 1.5x in order to easily understand.
Now there's an idea. I usually stick to videos that are less than 5 minutes, and sometimes up to 10 min. I get to cover more topics this way. Evenings and weekends are so short.

🤖 🚗🤖🚗🤖🚗🤖🚗🤖🚗

Here's what I want to try watching later this week:

 
Normal people will be watching short summary videos tomorrow, but just in case, here's the live feed for today:

 
So here's how the show started, and then for 3 hours they dug ever deeper into the guts and brains (which are similar to the cars').

 
You imagined it but never dared to mention it. Now you see it.

 