AI is only as good as the information it gets: if garbage goes in, garbage comes out. Unfortunately, acquiring good information about the physical world around an actor is not a trivial task, and calculating things like collisions and line-of-sight queries can take a fair bit of processing power. Providing an interface between the AI and the low-level collision representation is typically the role of a sensory system.
In this 90-minute masterclass, you'll discover the big picture that a programmer needs to know about sensory systems, using examples from Thief: Deadly Shadows. In particular: what information do they provide for the AI to reason with, and how do they create an interface between the AI and other systems? Then, focusing on the low level, you'll find out what it takes to build a good sensory system that can scale up and down depending on the computation power available on your target platform, and most importantly, how it can manage sensory queries efficiently.