The objective of this TU is not to cover every sensor and actuator that can be used in AI, but to understand how to apply some of the most relevant ones and to start thinking about the kinds of smart applications that can be built with them.
To this end, in the current TU students will develop a smartphone app that uses three of the most important sensors in AI — cameras, microphones, and touchscreens — as well as two very relevant actuators: speakers and LCD screens. Using these, students will create a simple but useful app that exploits human-machine interaction to capture information from the user and present it back in a natural way. This app will be improved in the next TU, so its real utility will become clear then.
[Figure caption: Teaching unit 2 tested at Universidade da Coruña. Scan the QR and orient yourself with the compass. Is that all? Wait until teaching unit 3 to see the final…]