A brain-computer interface (BCI) analyzes electroencephalography (EEG) signals to recognize an individual's intention or state and to control a computer or machine. However, most BCI research addresses motor imagery, and work on active movement is concentrated on the upper limb; studies of lower-limb movement mostly cover static states or single movements. Therefore, in this study we developed a deep-learning model that classifies walking behavior (1: walking, 2: upstairs, 3: downstairs) from EEG signals recorded in a dynamic environment, to verify that EEG signals can be classified in a dynamic state. The model combines a convolutional neural network (CNN) with a bidirectional long short-term memory (BiLSTM) network. It achieved an average recognition accuracy of 82.01%, with per-class accuracies of 93.77% for walking, 76.52% for upstairs, and 75.75% for downstairs. We anticipate that robotic devices designed to assist people with disabilities and the elderly could in the future incorporate features such as human-robot interaction, object manipulation, and path planning, using BCI for control.
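To illustrate the kind of architecture described, the following is a minimal PyTorch sketch of a CNN-BiLSTM classifier for three-class EEG data. The electrode count (32), window length (500 samples), kernel size, and hidden dimension are illustrative assumptions, not values from the paper; the actual model architecture and hyperparameters may differ.

```python
import torch
import torch.nn as nn

class CNNBiLSTM(nn.Module):
    """Hypothetical CNN-BiLSTM for 3-class EEG classification.
    Channel count, window length, and layer sizes are assumptions,
    not values reported in the paper."""
    def __init__(self, n_channels=32, n_classes=3, hidden=64):
        super().__init__()
        # 1-D convolution over the time axis extracts local temporal features
        self.cnn = nn.Sequential(
            nn.Conv1d(n_channels, 64, kernel_size=7, padding=3),
            nn.BatchNorm1d(64),
            nn.ReLU(),
            nn.MaxPool1d(4),
        )
        # Bidirectional LSTM models longer-range temporal dependencies
        self.lstm = nn.LSTM(64, hidden, batch_first=True, bidirectional=True)
        self.fc = nn.Linear(2 * hidden, n_classes)

    def forward(self, x):              # x: (batch, channels, time)
        f = self.cnn(x)                # (batch, 64, time // 4)
        f = f.permute(0, 2, 1)         # (batch, time // 4, 64)
        out, _ = self.lstm(f)
        return self.fc(out[:, -1, :])  # logits: walking / upstairs / downstairs

model = CNNBiLSTM()
logits = model(torch.randn(2, 32, 500))  # batch of 2 EEG windows
print(tuple(logits.shape))  # (2, 3)
```

The CNN stage compresses raw EEG into local feature maps, while the BiLSTM captures dependencies in both temporal directions before a linear layer produces class logits.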