We want to connect self-driving vehicle engineers and specialists,
bring together people of diverse backgrounds and areas of expertise,
and empower everyone to share their knowledge.
Let's make self-driving cars a reality together!
If AI is so important in training self-driving cars to drive, why aren’t car manufacturers paying us to put cameras on cars?
Terabytes of video are not helpful without the sensor and control data that describe where the car was and what it intended to do at that exact moment.
Driverless cars are rolling computers and sensor suites, hoovering up data during their trials.
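To make that concrete, a useful training sample pairs each camera frame with the vehicle state and control signals captured at the same instant. Here is a minimal sketch of what one logged record might look like; the field names and types are illustrative assumptions, not any manufacturer's actual format:

```python
from dataclasses import dataclass


@dataclass
class DriveLogSample:
    """One timestamped record from a data-collection run (illustrative fields only)."""
    timestamp_ns: int          # when the frame was captured
    camera_frame: bytes        # encoded image from a forward-facing camera
    gps_lat: float             # where the car was
    gps_lon: float
    speed_mps: float           # vehicle state at that instant
    steering_angle_rad: float  # what the driver/controller was actually doing
    throttle: float            # 0.0 to 1.0
    brake: float               # 0.0 to 1.0
    intended_maneuver: str     # e.g. "lane_keep", "left_turn" (planner intent)
```

Raw dashcam video alone carries only the first two fields; everything below them is what makes the footage usable as supervised training data.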
Are there any cases when autonomous vehicles are programmed to stop in place and wait?
While driving with Tesla Autopilot:
It cannot be turned on in weather conditions where it can't see the road (heavy fog, snow). The car will ask the human driver to take over if it is already self-driving and the weather worsens.
It will drive on an unknown road, provided it can “see the road.” The car wouldn’t knowingly put itself on a road it didn’t know existed.
Experiments show that masking a camera will prevent engagement of self-driving features.
Presently the car (Level 2 self-driving) requires a human driver. If the human is not interacting with the car, it will turn on the emergency flashers and slow to a stop.
Once the cars are ready for fully autonomous driving, they will have to learn to "pull over" just like a human would in similar conditions.
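A minimal sketch of the Level 2 fallback logic described above; the time thresholds and state names are made-up illustrations, not Tesla's actual values (real systems also use steering-torque sensing, driver cameras, and escalating alerts rather than a single timer):

```python
from enum import Enum, auto


class FallbackState(Enum):
    NORMAL = auto()
    WARN_DRIVER = auto()     # ask the human to take over
    EMERGENCY_STOP = auto()  # flashers on, slow to a stop in lane


def driver_monitor_step(seconds_since_driver_input: float,
                        warn_after_s: float = 15.0,
                        stop_after_s: float = 45.0) -> FallbackState:
    """Decide the fallback state from how long the driver has been inactive."""
    if seconds_since_driver_input >= stop_after_s:
        return FallbackState.EMERGENCY_STOP
    if seconds_since_driver_input >= warn_after_s:
        return FallbackState.WARN_DRIVER
    return FallbackState.NORMAL
```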
What are the main challenges and problems to be solved in the computer vision field?
Currently the most important challenge faced by autonomous cars in computer vision is running most of the algorithms in real time, and doing so in complex, cluttered environments. Building a vision model that achieves high accuracy is no longer a problem today thanks to deep learning, but making it work in real time in more complex environments, such as Indian roads, still is.
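As a rough illustration of that real-time constraint: a perception stack fed by a 30 fps camera has only about 33 ms per frame for detection, segmentation, and tracking combined. A minimal sketch of how one might check a model against such a budget (the frame rate and the `process_frame` callable are assumptions for illustration):

```python
import time

FRAME_BUDGET_S = 1.0 / 30.0  # roughly 33 ms per frame for a 30 fps camera


def real_time_fraction(process_frame, frames, budget_s=FRAME_BUDGET_S):
    """Return the fraction of frames whose processing fits the per-frame budget."""
    within_budget = 0
    for frame in frames:
        start = time.perf_counter()
        process_frame(frame)  # detection / segmentation / tracking pipeline
        elapsed = time.perf_counter() - start
        if elapsed <= budget_s:
            within_budget += 1
    return within_budget / max(len(frames), 1)
```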
Today, computer vision algorithms for autonomous vehicles face direct competition from LiDAR, because LiDAR can solve many of the problems that are much harder for vision algorithms and, at the same time, can also help vehicles with mapping and localization.
Vision struggles with SLAM, which is why its use in autonomous vehicles is now largely limited to lane-marking detection and generation and to building low-cost ADAS systems.
Will self-driving technology from Wayve revolutionize autonomous vehicles?
The simple answer is "No, not at the moment."
They claim that self-driving is possible with:
No HD-Maps,
No expensive sensor/compute suite,
No hand-coded rules,
Driving on roads never seen during training.
Replacing the entire self-driving pipeline with one end-to-end trained network seems to be impossible at this moment, but their case successfully demonstrates the potential of the computer vision and deep learning approach under some conditions.
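To illustrate what "one end-to-end trained network" means here, below is a toy sketch of a policy that maps a single camera image directly to steering and speed commands, trained by imitation of human driving. It only illustrates the idea of collapsing the modular pipeline into one network; it is not Wayve's actual architecture or anything close to production scale:

```python
import torch
import torch.nn as nn


class EndToEndDrivingPolicy(nn.Module):
    """Toy end-to-end policy: one camera image in, steering and speed out."""

    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 24, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(24, 48, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(48, 64, kernel_size=3, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.head = nn.Sequential(
            nn.Linear(64, 64), nn.ReLU(),
            nn.Linear(64, 2),  # [steering command, target speed]
        )

    def forward(self, image: torch.Tensor) -> torch.Tensor:
        return self.head(self.encoder(image))


# Trained by imitation: minimize the error against human steering/speed labels.
policy = EndToEndDrivingPolicy()
controls = policy(torch.randn(1, 3, 224, 224))  # -> tensor of shape (1, 2)
```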
Getting a robotic system to work in the field is a very complicated game; it may take years and years to achieve stable results in city and urban driving, under numerous conditions (day, night, rain, snow, etc.), in different countries with their own local laws and traffic/driving patterns.
Anyway, they have built a very strong team of professionals and are doing very well so far; check out some of their videos on urban driving.
Link to their blog post:
https://wayve.ai/blog/driving-like-human
Have driverless vehicles ever been tested at high speeds?
Hi! Yes, there are several great examples; one of them is Roborace.
Roborace is the world's first competition for human + machine teams, using both self-driving and manually controlled cars,
and they set the Guinness World Record for the Fastest Autonomous Car:
https://www.youtube.com/watch?v=zuzaabZLVSo
Can autonomous car developers program their cars to drive like the locals?
Well, driving like a local usually means doing some illegal driving maneuvers, which have long been a cause of accidents.
Developers have to, and will, program cars to drive according to the traffic rules.
Locals ignore many rules, but they will have to get used to driverless cars that drive correctly.
What is the difference between Model Predictive Control, Generalized Predictive Control, and Long Range Predictive Control?
Generalized predictive control (GPC) is a member of the MPC family of methods, where the mathematical model of the system is a “controlled auto-regressive and integrated moving-average” (CARIMA) model.
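For reference, the CARIMA model is usually written in terms of polynomials in the backward-shift operator (notation varies slightly between texts); a standard form is:

```latex
% A, B, C are polynomials in the backward-shift operator q^{-1};
% u is the control input, y the output, e a white-noise term.
\[
A(q^{-1})\, y(t) \;=\; B(q^{-1})\, u(t-1) \;+\; \frac{C(q^{-1})}{\Delta}\, e(t),
\qquad \Delta = 1 - q^{-1}.
\]
```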
Long range predictive control (LRPC) seems (although I might be wrong since I am seeing this name for the first time) to be used interchangeably with MPC (especially around 1990–2000).
Model predictive control (MPC) is a family of control methods based on real-time repeated optimal control. These methods are intended for solving multivariable, constrained, infinite horizon, and possibly nonlinear, optimal control problems approximately via finite horizon solutions implemented in receding horizon fashion. These finite horizon solutions involve optimizing the objective function for the (finite) prediction horizon, where the predictions are based on a mathematical model of the dynamical system to be controlled.
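For concreteness, a common way to write the finite-horizon problem solved at each sampling instant is the quadratic-cost form below; this is a generic sketch, and the cost, the model f, and the constraint sets vary between formulations:

```latex
% Finite-horizon optimal control problem, re-solved at every sampling instant;
% only the first input u_0 is applied (receding horizon).
\[
\min_{u_0,\dots,u_{N-1}} \;
\sum_{k=0}^{N-1} \left( x_k^\top Q\, x_k + u_k^\top R\, u_k \right)
\;+\; x_N^\top P\, x_N
\]
\[
\text{s.t.}\quad
x_{k+1} = f(x_k, u_k), \qquad
x_k \in \mathcal{X}, \qquad
u_k \in \mathcal{U}, \qquad
x_0 = x(t).
\]
```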
There are many variants of MPC (this list is not exhaustive):
> model algorithmic control: MPC using an impulse-response model
> dynamic matrix control: MPC using a step-response model
> generalized predictive control: MPC using a CARIMA model
> hybrid MPC: MPC using a hybrid system model (i.e., with binary state variables and/or control inputs in addition to continuous ones)
> with uncertainty treatment: robust MPC, stochastic MPC
> with real-time system identification: adaptive MPC (model is updated online)
> with parametric optimization: explicit MPC (control law is computed offline and implemented as a look-up table)