
[Beta] How do I set up and manage Data streams?


Note: This is a beta feature; we welcome your feedback as you test it out. As a beta feature, this process and content are subject to change.

Related article: [Beta] How do I filter Data streams to send specific Events and Properties?

A Data stream is a process that transmits Buzz system events (formatted as JSON objects) for data analysis, statistical tracking, or synchronization with another system.

In Buzz, Administrators can use different Stream types and configure multiple Data streams to allow data for various Events to be sent, near real-time, to one or more third-party services.

Use Buzz's Data streams to collect, analyze, process, and—ultimately—leverage data to:

  • Streamline processes.
  • Complete statistical research and compliance reporting.
  • Improve student experience and teacher effectiveness.

Stream types

Buzz is set up to deliver data using the following Stream types:

  • Amazon Kinesis Data Firehose: Amazon Kinesis Data Firehose is an extract, transform, and load (ETL) service that reliably captures, transforms, and delivers streaming data to data lakes, data stores, and analytics services.
  • Amazon Kinesis Data Stream: Amazon Kinesis Data Streams is a serverless streaming data service.
  • Amazon Simple Queue Service (SQS): Amazon Simple Queue Service (SQS) lets you send, store, and receive messages between software components at any volume, without losing messages or requiring other services to be available.
  • HTTPS: HTTPS streaming allows data to be sent continuously to a client over a single HTTPS connection that remains open indefinitely.

Event types

The Data stream can send various events related to domain, course, user, and enrollment data. To learn more about all of the available events and how you can filter your stream based upon these events, see [Beta] How do I filter Data streams to send specific Events and Properties?

  • Domain data Events:  
    • Domain created
    • Domain changed
    • Domain deleted
  • Course data Events:
    • Course created
    • Course changed
    • Course deleted
  • User data Events:
    • User created
    • User changed
    • User deleted
  • Enrollment data Events:
    • Enrollment created
    • Enrollment changed
    • Enrollment deleted

Configure Data streams with Amazon Kinesis Data Firehose

When setting up your Amazon Kinesis Data Firehose account for our API servers to write into, you need to configure cross-account access.

In order to do this:

  1. First, set up the firehose as you would normally for internal use, following AWS instructions.
    • Use Direct PUT firehose.
    • This process creates an IAM Role. This IAM Role is the one the Firehose service runs under and gives it access to S3, Redshift, Elasticsearch, etc.; it is not the one used by us (or you) to write data to the firehose.
  2. Next, create a cross-account role in your account that establishes a limited trust relationship between our AWS account and your own.
    • The session duration for this role should be 12 hours.
    • The following permissions are required for this role: firehose:PutRecord and firehose:PutRecordBatch on the target stream.
    • The firehose:ListDeliveryStreams permission may also be useful for diagnosing access issues, but it is not required by the Agilix API service.
  3. Lastly, add Trust Relationship permissions for Agilix's AWS account for sts:AssumeRole on the cross-account role you just created.
    • Contact your Agilix sales representative or support to coordinate exchanging this information.
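As a sketch, the cross-account role's permissions policy might look like the following (the region, account ID, and stream name are placeholders; substitute your own values):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["firehose:PutRecord", "firehose:PutRecordBatch"],
      "Resource": "arn:aws:firehose:us-east-1:111111111111:deliverystream/YourStreamName"
    }
  ]
}
```

And the trust relationship on the same role might look like this, where AGILIX_ACCOUNT_ID is a placeholder for the account ID you obtain from your Agilix representative:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "AWS": "arn:aws:iam::AGILIX_ACCOUNT_ID:root" },
      "Action": "sts:AssumeRole"
    }
  ]
}
```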

To configure Data streams in Buzz using Amazon Kinesis Data Firehose:

  1. Open the More menu in Admin > Domain.
  2. Select Data streams.
  3. Provide a Description of the Data stream. The Description is for your own use and should briefly state what the Data stream is for and/or how you're using it.
    • Example: If you're setting up a Data stream to deliver all new domain data to an admin dashboard, your Description might be AdminDashboard_NewDomains_Firehose.
  4. Select Amazon Kinesis Data Firehose as your Stream type.
  5. Provide the Stream name.
    • This is an AWS identifier: the name of the Kinesis Data Firehose Delivery Stream you want to put event records into.
  6. Provide your ARN role.
    • You can find your ARN (Amazon Resource Name) role through AWS.
  7. Check the Enabled box to ensure the Data stream begins working when you Save.
    • If you're not ready to enable the stream, you can leave the box unchecked and Save the configuration; the Data stream will not begin sending Events.
  8. Click Add filter if you want to limit the amount of data sent to meet your specific needs and optimize storage.
  9. Once you're done configuring your Data stream, you can:
    • Click Test to make sure the Data stream is set up correctly and working. This sends a single event, so you can verify that your configuration is correct.
    • Click Save. If the Enabled box is checked, your Data stream begins sending data; if not, your configuration is saved, and no data is sent.

Note: If you Enable and Save your Data stream without adding filters, Buzz automatically sends data for all Events to the defined destination. This can result in large amounts of unnecessary storage.

Configure Data streams with Amazon Kinesis Data Stream

When setting up your Amazon Kinesis Data Stream account for our API servers to write into, you need to configure cross-account access.

In order to do this:

  1. First, set up the Kinesis Data Stream as you would normally for internal use, following AWS instructions.
  2. Next, create a cross-account role in your account that establishes a limited trust relationship between our AWS account and your own.
    • The session duration for this role should be 12 hours.
    • The following permissions are required for this role: kinesis:PutRecord and kinesis:PutRecords on the target stream.
    • The kinesis:ListStreams permission may also be useful for diagnosing access issues, but it is not required by the Agilix API service.
  3. Lastly, add Trust Relationship permissions for Agilix's AWS account for sts:AssumeRole on the cross-account role you just created.
    • Contact your Agilix sales representative or support to coordinate exchanging this information.
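Sketched as an IAM policy statement (the region, account ID, and stream name are placeholders), the required permissions might look like:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["kinesis:PutRecord", "kinesis:PutRecords"],
      "Resource": "arn:aws:kinesis:us-east-1:111111111111:stream/YourStreamName"
    }
  ]
}
```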

To configure Data streams in Buzz using Amazon Kinesis Data Stream:

  1. Open the More menu in Admin > Domain.
  2. Select Data streams.
  3. Provide a Description of the Data stream. The Description is for your own use and should briefly state what the Data stream is for and/or how you're using it.
    • Example: If you're setting up a Data stream to deliver all new domain data to an admin dashboard, your Description might be AdminDashboard_NewDomains_DataStream.
  4. Select Amazon Kinesis Data Stream as your Stream type.
  5. Provide the Stream name.
    • This is an AWS identifier: the name of the Kinesis Data Stream you want to put event records into.
  6. Provide your ARN role.
    • You can find your ARN (Amazon Resource Name) role through AWS.
  7. Check the Enabled box to ensure the Data stream begins working when you Save.
    • If you're not ready to enable the stream, you can leave the box unchecked and Save the configuration; the Data stream will not begin sending Events.
  8. Click Add filter if you want to limit the amount of data sent to meet your specific needs and optimize storage.
  9. Once you're done configuring your Data stream, you can:
    • Click Test to make sure the Data stream is set up correctly and working. This sends a single event, so you can verify that your configuration is correct.
    • Click Save. If the Enabled box is checked, your Data stream begins sending data; if not, your configuration is saved, and no data is sent.

Note: If you Enable and Save your Data stream without adding filters, Buzz automatically sends data for all Events to the defined destination. This can result in large amounts of unnecessary storage.

Configure Data streams with Amazon Simple Queue Service (SQS)

When setting up your Amazon SQS Queue for our API servers to write into, you need to configure cross-account access.

In order to do this:

  1. First, set up the SQS Queue as you would normally for internal use, following AWS instructions.
  2. Next, create a cross-account role in your account that establishes a limited trust relationship between our AWS account and your own.
    • The session duration for this role should be 12 hours.
    • The following permissions are required for this role: sqs:GetQueueUrl and sqs:SendMessage on the target queue.
    • The sqs:ListQueues permission may also be useful for diagnosing access issues, but it is not required by the Agilix API service.
  3. Lastly, add Trust Relationship permissions for Agilix's AWS account for sts:AssumeRole on the cross-account role you just created.
    • Contact your Agilix sales representative or support to coordinate exchanging this information.
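Sketched as an IAM policy statement (the region, account ID, and queue name are placeholders), the required permissions might look like:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["sqs:GetQueueUrl", "sqs:SendMessage"],
      "Resource": "arn:aws:sqs:us-east-1:111111111111:YourQueueName"
    }
  ]
}
```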

To configure Data streams in Buzz using Amazon Simple Queue Service (SQS):

  1. Open the More menu in Admin > Domain.
  2. Select Data streams.
  3. Provide a Description of the Data stream. The Description is for your own use and should briefly state what the Data stream is for and/or how you're using it.
    • Example: If you're setting up a Data stream to deliver all new domain data to an admin dashboard, your Description might be AdminDashboard_NewDomains_SQS.
  4. Select Amazon Simple Queue Service as your Stream type.
  5. Provide the Stream name.
    • This is an AWS identifier: the name of the SQS Queue you want to put event records into.
  6. Provide your ARN role.
    • You can find your ARN (Amazon Resource Name) role through AWS.
  7. Check the Enabled box to ensure the Data stream begins working when you Save.
    • If you're not ready to enable the stream, you can leave the box unchecked and Save the configuration; the Data stream will not begin sending Events.
  8. Click Add filter if you want to limit the amount of data sent to meet your specific needs and optimize storage.
  9. Once you're done configuring your Data stream, you can:
    • Click Test to make sure the Data stream is set up correctly and working. This sends a single event, so you can verify that your configuration is correct.
    • Click Save. If the Enabled box is checked, your Data stream begins sending data; if not, your configuration is saved, and no data is sent.

Note: If you Enable and Save your Data stream without adding filters, Buzz automatically sends data for all Events to the defined destination. This can result in large amounts of unnecessary storage.

Configure Data streams with HTTPS

To configure Data streams in Buzz using HTTPS:

  1. Open the More menu in Admin > Domain.
  2. Select Data streams.
  3. Provide a Description of the Data stream. The Description is for your own use and should briefly state what the Data stream is for and/or how you're using it.
    • Example: If you're setting up a Data stream to deliver all new domain data to an admin dashboard, your Description might be AdminDashboard_NewDomains_SISName.example.com.
  4. Select HTTPS as your Stream type.
    • HTTPS streaming allows data to be sent continuously to a client over a single HTTPS connection that remains open indefinitely.
  5. Provide a Stream name. This is an internal unique identifier for this stream and should be an abbreviated reference to the URL(s) that receive the data.
    • Example: If you are sending the data to a student information system, you might use SISName.example.com_AdminDashboard.
  6. Provide the:
    • Timeout seconds, or the amount of time you want to allow for the HTTPS connection.
    • Retries, or the number of times you want Buzz to retry the HTTPS connection.
  7. Provide the Endpoints, or URL(s) to which the data is sent. You can have up to five Endpoints separated by semicolons, each acting as a backup. Your Data stream attempts to send data to each Endpoint in order until one is successfully reached.
  8. Select the HTTP method: POST or PUT.
  9. Check the Enabled box to ensure the Data stream begins working when you Save.
    • If you're not ready to enable the stream, you can leave the box unchecked and Save the configuration; the Data stream will not begin sending Events.
  10. Click Add filter if you want to limit the amount of data sent to meet your specific needs and optimize storage.
  11. Once you're done configuring your Data stream, you can:
    • Click Test to make sure the Data stream is set up correctly and working. This sends a single event, so you can verify that your configuration is correct.
    • Click Save. If the Enabled box is checked, your Data stream begins sending data; if not, your configuration is saved, and no data is sent.

Note: If you Enable and Save your Data stream without adding filters, Buzz automatically sends data for all Events to the defined destination. This can result in large amounts of unnecessary storage.

Learn more: [Beta] How do I filter Data streams to send specific Events and Properties?
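As a rough sketch of the receiving side (not an official Agilix example; the port, routing, and handler behavior are assumptions), an endpoint that accepts Buzz Event records might look like this in Python. In production the server must be reachable over HTTPS, for example behind a TLS-terminating reverse proxy.

```python
# Minimal sketch of an endpoint that receives Buzz Data stream Event records.
# Assumes Buzz delivers one JSON event record per POST/PUT request body.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def handle_record(record):
    """Route an incoming Event record by its container 'type' member."""
    # A real handler would update a dashboard, sync a SIS, etc.
    return record.get("type", "Unknown")

class BuzzStreamHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        try:
            record = json.loads(self.rfile.read(length))
        except ValueError:
            self.send_response(400)  # malformed record
            self.end_headers()
            return
        handle_record(record)
        self.send_response(200)  # acknowledge receipt of the event
        self.end_headers()

    do_PUT = do_POST  # the stream can be configured to use POST or PUT

# To run locally (plain HTTP; terminate TLS in front of it for production):
# HTTPServer(("", 8443), BuzzStreamHandler).serve_forever()
```

A success status code tells the stream the Endpoint was successfully reached; otherwise Buzz moves on to the next Endpoint or retry as configured above.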

Filter your Data streams to send only the data you need

Because Data streams automatically send data that you will be storing, Buzz lets you filter the data by Events and Properties within those Events to avoid using storage for data you don't need.

Learn more: [Beta] How do I filter Data streams to send specific Events and Properties?

Interpret and map Data stream content for your use

Data streams deliver data as Event records. Each Event record is always wrapped in a standard container, which you can see in the following example (all the top-level members are part of the standard container):

{
	"time":"2022-06-29T16:40:30.9926931Z",
	"guid":"c7a820f6-36ed-48f6-8210-3581238d30a6",
	"domainId":"1176022",
	"type":"DomainEntityChanged",
	"data":{... },
	"userId":"93275",
	"agentUserId":"2239",
	"sessionId":"8655695467559064423"
}

Note: The above record is formatted for readability, but when delivered each record in the data stream appears as a single line with minimal whitespace.

Each member of the standard container is defined below:

  • time: The UTC time of the Event.
  • guid: A GUID that uniquely identifies the action, even across Data streams and domains.
  • domainId: The ID of the domain where the Event occurred.
  • type: The type of the Event, which identifies the schema used for the data Property.
  • data: An object with details about the Event; its contents differ for each Event type.
  • userId: The ID of the authorized user who performed the action (may be missing if the action was performed internally by the system itself). If a user was proxying another user, this is the ID of the user being proxied.
  • agentUserId: The ID of the agent user who performed the action (may be missing if the action was performed internally by the system itself). If a user was proxying another user, this is the ID of the user acting as proxy.
  • sessionId: The ID of the user session that made the request (may be missing if the action was performed internally by the system itself).
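For example, a consumer can parse each delivered line as JSON and read the container members defensively (a sketch; the sample values come from the record above):

```python
# Parse one delivered Event record line (newline-delimited JSON).
import json

line = ('{"time":"2022-06-29T16:40:30.9926931Z",'
        '"guid":"c7a820f6-36ed-48f6-8210-3581238d30a6",'
        '"domainId":"1176022","type":"DomainEntityChanged","data":{},'
        '"userId":"93275","agentUserId":"2239",'
        '"sessionId":"8655695467559064423"}')

record = json.loads(line)

# userId, agentUserId, and sessionId may be missing for actions the system
# performed internally, so read them defensively.
actor = record.get("agentUserId", record.get("userId", "system"))
print(record["type"], "in domain", record["domainId"], "by", actor)
# → DomainEntityChanged in domain 1176022 by 2239
```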

Have a question or feedback? Let us know over in Discussions!