Analyzing crowd events in video is key to understanding the behavioral characteristics of people. Detecting crowd events in videos is challenging because of articulated human movements and occlusions. The aim of this paper is to detect events in a probabilistic framework for automatically interpreting visual crowd behavior. In this paper, crowd event detection and classification in optical flow manifolds (OFMs) are addressed. A new algorithm is proposed to detect walking and running events using optical flow vector lengths in OFMs, and a further algorithm is proposed to detect merging and splitting events using Riemannian connections in the optical flow bundle (OFB). The longest vector in the OFB provides a key feature for distinguishing walking from running events. Using a Riemannian connection, the optical flow vectors are parallel transported to localize the crowd groups, and the geodesic lengths among the groups provide a criterion for merging and splitting events. Dispersion and evacuation events are modeled jointly from the walking/running and merging/splitting events. Using the Performance Evaluation of Tracking and Surveillance (PETS) 2009 dataset, the proposed method is shown to produce the best results for merging, splitting, and dispersion events, and comparable results for walking, running, and evacuation events, when compared with other methods.
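The two detection cues described above can be sketched in simplified form. This is a minimal illustration, not the paper's method: plain Euclidean distance stands in for geodesic lengths on the manifold, and the function names and thresholds are hypothetical choices for the sketch.

```python
import numpy as np

def longest_flow_length(flow):
    """Magnitude of the longest optical flow vector in a frame.
    flow: (H, W, 2) array of per-pixel (dx, dy) displacements."""
    return float(np.linalg.norm(flow, axis=-1).max())

def classify_gait(flow, run_threshold=5.0):
    """Label a frame 'running' when the longest flow vector exceeds
    run_threshold pixels per frame, else 'walking'.
    The threshold value is hypothetical, not one reported in the paper."""
    return "running" if longest_flow_length(flow) > run_threshold else "walking"

def classify_group_event(prev_centroids, curr_centroids):
    """'merging' if the mean pairwise distance between crowd-group
    centroids shrinks between frames, 'splitting' if it grows.
    Euclidean distance approximates the paper's geodesic lengths."""
    def mean_pairwise(pts):
        pts = np.asarray(pts, dtype=float)
        d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
        n = len(pts)
        return d.sum() / (n * (n - 1))
    if mean_pairwise(curr_centroids) < mean_pairwise(prev_centroids):
        return "merging"
    return "splitting"

# Synthetic demo: a slow flow field, one fast mover, and two groups
# whose centroids approach each other over consecutive frames.
slow = np.full((4, 4, 2), 1.0)   # every vector has length sqrt(2)
fast = slow.copy()
fast[2, 2] = (6.0, 0.0)          # one vector of length 6.0

print(classify_gait(slow))                                        # walking
print(classify_gait(fast))                                        # running
print(classify_group_event([(0, 0), (10, 0)], [(2, 0), (8, 0)]))  # merging
```

In practice the flow field would come from a dense optical flow estimator rather than being constructed by hand, and group centroids would come from the parallel-transport localization step described in the paper.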