By default, the data is available on the "Internal Broker", a RabbitMQ broker. Each organization has its own exchange associated with it.
To consume the data generated by your devices/sensors in your organization from the cloud, do the following:
1. Find your exchange details: click Data > Data Destinations > Internal Broker.
2. Note the "Observation Exchange Name".
3. Note the "Server name" (without the port).
4. Download the attached code and certificate and put them in the same folder.
5. Run the script as follows, splitting your API key at the first "." into a username and password:
   python data.py -r us.ciscokinetic.io -e <Observation Exchange Name> -u '<API key before the ".">' -P '<API key after the ".">'
6. To receive data that arrived while your consumer was down, see the script options for declaring your queue as durable.
7. API keys: create a key with the "Data Management" role and "Read Only" permission for consuming data.
8. For a quick test with a CLI data publisher, see https://developer.cisco.com/docs/kinetic/#integrate-an-existing-app/integrate-an-existing-app
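If you want to see what the attached script does under the hood, the steps above can be sketched with the pika RabbitMQ client. This is a minimal sketch, not the attached data.py: the queue name, CA certificate path, AMQPS port 5671, and "#" routing key are assumptions; the host, exchange name, and the "split the API key at the first dot" convention come from the steps above.

```python
import ssl


def split_api_key(api_key):
    """Split an API key of the form '<user>.<secret>' at the FIRST dot (step 5)."""
    user, _, secret = api_key.partition(".")
    return user, secret


def consume(host, exchange, api_key, ca_cert="./cert.pem"):
    # pika is the de facto RabbitMQ client for Python (pip install pika)
    import pika

    user, secret = split_api_key(api_key)
    ctx = ssl.create_default_context(cafile=ca_cert)  # the attached certificate
    params = pika.ConnectionParameters(
        host=host,                       # "Server name" from step 3
        port=5671,                       # assumption: default AMQPS port
        credentials=pika.PlainCredentials(user, secret),
        ssl_options=pika.SSLOptions(ctx, server_hostname=host),
    )
    conn = pika.BlockingConnection(params)
    ch = conn.channel()

    # durable=True keeps the queue across broker/consumer restarts, so
    # observations published while the consumer is down are retained (step 6).
    result = ch.queue_declare(queue="my-observations", durable=True)
    # Bind to the "Observation Exchange Name" from step 2; "#" matches all keys.
    ch.queue_bind(exchange=exchange, queue=result.method.queue, routing_key="#")

    def on_message(channel, method, properties, body):
        print(body)
        channel.basic_ack(delivery_tag=method.delivery_tag)

    ch.basic_consume(queue=result.method.queue, on_message_callback=on_message)
    ch.start_consuming()
```

Called, for example, as `consume("us.ciscokinetic.io", "<Observation Exchange Name>", "<your API key>")`.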