Motor Imagery (MI) of the upper limbs has been extensively studied in the literature using static visual cues. However, such protocols do not address the movements involved in Activities of Daily Living (ADLs), which remain an open challenge for the scientific community. Action Observation (AO) has been proposed as a way to improve the recognition of complex tasks. This study therefore proposes a methodology to analyze Electroencephalography (EEG) signals during MI of the manipulation of a cup in different positions, presented as a first-person representation in a 2D virtual reality. Power Spectral Density (PSD), Event-Related Desynchronization/Synchronization (ERD/ERS), and Signal-to-Noise Ratio (SNR) were used to identify differences between MI of rest and MI of the movements. The results showed significant differences among MI tasks in the mu and beta frequency bands, as well as ERD/ERS patterns most pronounced over the cortico-motor and parieto-central regions of the brain. These findings are relevant to the implementation of Brain-Computer Interfaces (BCIs) based on MI of complex movements, which could improve usability and controllability in motor rehabilitation. Overall, this study demonstrates the potential of AO-based MI protocols for analyzing brain activity during complex movements, which can deepen our understanding of motor imagery and inform the development of new rehabilitation strategies.
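To make the ERD/ERS measure mentioned above concrete, the sketch below computes the standard percentage change in band power during a task epoch relative to a reference (rest) epoch, ERD/ERS% = 100 · (A − R) / R. This is a minimal NumPy illustration on synthetic mu-band data, not the paper's actual pipeline: the sampling rate, epoch lengths, and the simple periodogram-based band-power estimate are all assumptions for demonstration.

```python
import numpy as np

def band_power(x, fs, fmin, fmax):
    """Mean periodogram power of signal x within [fmin, fmax] Hz."""
    f = np.fft.rfftfreq(len(x), 1.0 / fs)
    pxx = np.abs(np.fft.rfft(x)) ** 2 / len(x)
    mask = (f >= fmin) & (f <= fmax)
    return pxx[mask].mean()

def erd_ers(task, baseline, fs, band=(8, 13)):
    """ERD/ERS% = 100 * (A - R) / R; negative values indicate ERD."""
    a = band_power(task, fs, *band)       # band power during the MI task
    r = band_power(baseline, fs, *band)   # band power during rest/reference
    return 100.0 * (a - r) / r

# Synthetic demo: a 10 Hz (mu-band) oscillation attenuated during "MI",
# mimicking the desynchronization expected over sensorimotor areas.
fs = 250                                  # assumed sampling rate (Hz)
t = np.arange(0, 2.0, 1.0 / fs)           # 2 s epochs
rng = np.random.default_rng(0)
baseline = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(t.size)
task = 0.5 * np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(t.size)

print(f"mu-band ERD/ERS: {erd_ers(task, baseline, fs):.1f} %")
```

Halving the oscillation amplitude roughly quarters its band power, so the printed value is strongly negative, i.e. a desynchronization (ERD); a positive value would indicate synchronization (ERS).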