With advances in network capabilities, the gaming industry is increasingly turning towards offering "gaming on demand" solutions, with cloud gaming services such as Sony PlayStation Now, Google Stadia, and NVIDIA GeForce NOW expanding their market offerings. Similar to adaptive video streaming services, cloud gaming services typically adapt the quality of game streams (e.g., bitrate, resolution, frame rate) in accordance with current network conditions. To select the most appropriate video encoding parameters under given conditions, it is important to understand their impact on Quality of Experience (QoE). Network operators, in turn, are interested in understanding the relationships between parameters measurable in the network and cloud gaming QoE, so as to invoke QoE-aware network management mechanisms. To encourage developments in these areas, comprehensive datasets containing both network- and application-layer data are crucial. This paper presents CGD, a dataset consisting of 600 game streaming sessions corresponding to 10 games of different genres, played and streamed using the following encoding parameters: bitrate (5, 10, 20 Mbps), resolution (720p, 1080p), and frame rate (30, 60 fps). Each of the 12 encoding combinations was repeated five times per game, and for every session the dataset includes: 1) gameplay video recordings, 2) network traffic traces, 3) user input logs (mouse and keyboard), and 4) streaming performance logs.

The cloud gaming dataset comprises 600 gameplay videos of 10 different games belonging to a wide range of genres. Alongside the raw video files (gameplay sessions were recorded losslessly using the FRAPS application), the dataset includes the network traffic captured during these cloud gaming sessions (using the Wireshark tool), the corresponding user input (collected with the MacroRecorder application), and application-level statistical summaries of the sessions gathered from the Steam client.
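The session count follows directly from the parameter grid described above: 10 games, 12 encoding combinations, and 5 repetitions. As an illustrative sketch (the variable and game names below are placeholders, not identifiers from the dataset), the full set of sessions can be enumerated as:

```python
from itertools import product

# Placeholder game names; the encoding parameter values are those
# stated in the paper (bitrate in Mbps, resolution, frame rate in fps).
games = [f"game_{i}" for i in range(10)]
bitrates_mbps = [5, 10, 20]
resolutions = ["720p", "1080p"]
frame_rates_fps = [30, 60]
repetitions = 5

# One entry per recorded session: every encoding combination is
# repeated five times for each game.
sessions = [
    (game, bitrate, resolution, fps, rep)
    for game, bitrate, resolution, fps in product(
        games, bitrates_mbps, resolutions, frame_rates_fps
    )
    for rep in range(repetitions)
]

print(len(sessions))  # 10 games x 12 combinations x 5 repetitions = 600
```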