Please use this identifier to cite or link to this item: http://biblioteca.unisced.edu.mz/handle/123456789/3098
Full metadata record
dc.contributor.author: Fingscheidt, Tim
dc.contributor.author: Gottschalk, Hanno
dc.contributor.author: Houben, Sebastian
dc.date.accessioned: 2023-09-29T15:39:20Z
dc.date.available: 2023-09-29T15:39:20Z
dc.date.issued: 2022-01-29
dc.identifier.uri: http://biblioteca.unisced.edu.mz/handle/123456789/3098
dc.description.abstract: This open access book brings together the latest developments from industry and research on automated driving and artificial intelligence. Environment perception for highly automated driving relies heavily on deep neural networks and faces many challenges: How much data do we need for training and testing? How can synthetic data be used to save labeling costs for training? How do we increase robustness and decrease memory usage? Under inevitably poor conditions, how do we know that the network is uncertain about its decisions? Can we understand a bit more about what actually happens inside neural networks? This leads to a very practical problem, particularly for DNNs employed in automated driving: What are useful validation techniques, and how do we ensure safety? This book unites the views of both academia and industry, where computer vision and machine learning meet environment perception for highly automated driving. Naturally, aspects of data, robustness, uncertainty quantification, and, last but not least, safety are at its core. The book is unique: its first part provides an extended survey of all the relevant aspects, and its second part contains the detailed technical elaboration of the various questions mentioned above. (en_US)
dc.language.iso: en (en_US)
dc.publisher: Springer Nature (en_US)
dc.title: Deep Neural Networks and Data for Automated Driving (en_US)
dc.type: Book (en_US)
Appears in Collections: Inteligência Artificial

Files in This Item:
File: 978-3-031-01233-4.pdf
Size: 12.16 MB
Format: Adobe PDF
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.