Smart City Intersections

Dense urban environments present unique challenges to the deployment of autonomous vehicles, including a large number of vehicles moving at various speeds, obstructions that are opaque to in-vehicle sensors, and chaotic pedestrian behavior. Smart city intersections will be at the core of AI-powered traffic management systems for crowded metropolises. COSMOS will provide all the components needed for developing smart intersections, and will support cloud-connected vehicles to overcome the limitations of autonomous vehicles. In particular, COSMOS will enable vehicles to wirelessly share in-vehicle sensor data with other vehicles and with the edge cloud servers. COSMOS will also deploy a variety of infrastructure sensors, including street-level and bird’s-eye cameras, whose data will be aggregated by the servers. The servers will run real-time algorithms to monitor and manage traffic.

We devised an example experiment at the COSMOS pilot site. It relies on the concept of a “radar screen”: a continuously evolving, real-time snapshot of the positions and velocity vectors of all objects in the intersection. The radar screen will be constructed by the edge cloud servers using learning algorithms that are dynamically distributed across the available computing resources according to application latency requirements and available bandwidth. The radar screen will then be wirelessly broadcast to the participants in the intersection within a constrained time period. Results of experiments that use bird’s-eye cameras at the COSMOS pilot site to detect and track vehicles and pedestrians are reported in [1], where we evaluate and customize video pre-processing and deep-learning algorithms to assess real-time computation capabilities as well as detection and tracking accuracy.
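To make the radar-screen concept concrete, the following is a minimal sketch of how a per-frame snapshot of positions and velocity vectors might be maintained. The data layout, the `TrackedObject` and `update_radar_screen` names, and the velocity estimate via frame-to-frame differencing are illustrative assumptions, not the actual COSMOS implementation; the deployed system uses deep-learning detection and tracking algorithms as described in [1].

```python
from dataclasses import dataclass

@dataclass
class TrackedObject:
    obj_id: int
    cls: str          # e.g., "car", "pedestrian", "bicycle"
    x: float          # ground-plane position (meters)
    y: float
    vx: float = 0.0   # estimated velocity vector (m/s)
    vy: float = 0.0

def update_radar_screen(prev, detections, dt):
    """Build the next radar-screen snapshot from per-frame detections.

    `prev` is the previous snapshot (obj_id -> TrackedObject);
    `detections` maps obj_id -> (cls, x, y) for the current frame;
    `dt` is the time between snapshots. Velocities are estimated by
    differencing positions between consecutive snapshots (a stand-in
    for a proper tracker).
    """
    screen = {}
    for obj_id, (cls, x, y) in detections.items():
        if obj_id in prev:
            p = prev[obj_id]
            vx, vy = (x - p.x) / dt, (y - p.y) / dt
        else:
            vx = vy = 0.0  # no history yet for a newly seen object
        screen[obj_id] = TrackedObject(obj_id, cls, x, y, vx, vy)
    return screen
```

In a deployment, each snapshot would then be serialized and broadcast to intersection participants within the application's latency budget.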

Another example experiment where the cameras were used to assess compliance with social distancing policies during the COVID-19 pandemic appears in [2].
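The core of such a compliance check is a pairwise distance test over detected pedestrian positions. Below is a minimal sketch of that step; the function name, the 1.8 m threshold, and the assumption that detections have already been projected onto ground-plane coordinates are illustrative, and the actual Auto-SDA pipeline in [2] is considerably more involved.

```python
import math
from itertools import combinations

def count_violations(positions, min_distance=1.8):
    """Count pedestrian pairs closer than `min_distance` meters.

    `positions` is a list of (x, y) ground-plane coordinates, e.g.
    obtained by projecting each detected pedestrian's image location
    onto the street plane via a camera homography.
    """
    return sum(
        1
        for (x1, y1), (x2, y2) in combinations(positions, 2)
        if math.hypot(x2 - x1, y2 - y1) < min_distance
    )
```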

Figure 1. Locations of the cameras and edge cloud servers at the COSMOS pilot intersection, with an example “radar screen” view captured by a camera deployed on the 12th floor of Columbia’s Mudd building, and the corresponding locations and velocity vectors of bicycles, cars, and pedestrians obtained using deep learning algorithms.

[1] S. Yang, E. Bailey, Z. Yang, J. Ostrometzky, G. Zussman, I. Seskar, and Z. Kostic, “COSMOS Smart Intersection: Edge Compute and Communications for Bird’s Eye Object Tracking,” in Proc. 4th International Workshop on Smart Edge Computing and Networking (SmartEdge’20), IEEE PerCom Workshops, Mar. 2020.

[2] M. Ghasemi, Z. Kostic, J. Ghaderi, and G. Zussman, “Auto-SDA: Automated Video-based Social Distancing Analyzer,” in Proc. 3rd Workshop on Hot Topics in Video Analytics and Intelligent Edges (HotEdgeVideo’21) (to appear), 2021.