Event Streaming reference
A stream lets an application subscribe to database changes in real time. Fauna sends the application an event whenever tracked changes are made.
Streams are useful for building features that need to react to data changes, such as:

- Real-time dashboards
- Chat apps
- Pub/sub integration
- Multiplayer games
Stream basics
You typically create and subscribe to streams using a Fauna client driver. To create a stream, you call toStream() or changesOn() on a Set in an FQL query.
The Set used for a stream must be from a supported source. The exact behavior of each method depends on this source. Source Sets for streams support a limited number of transformations and filters.
When the query runs, Fauna sends the client a stream token. To start and subscribe to the stream, the client sends a request containing the stream token to Fauna’s /stream/1 HTTP API endpoint. Client drivers handle this request transparently.

In response, Fauna sends a status event, indicating the stream has started.
{
"type": "status",
"txn_ts": 1710968002310000,
"stats": {
"read_ops": 8,
"storage_bytes_read": 208,
"compute_ops": 1,
"processing_time_ms": 0,
"rate_limits_hit": []
}
}
The /stream/1 request’s connection remains open. If a tracked change occurs, Fauna sends a related add, remove, or update event. These events include the triggering document in the data field.
{
"type": "update",
"data": {
"@doc": {
"id": "392914348360597540",
"coll": { "@mod": "Product" },
"ts": { "@time": "2024-03-21T12:35:18.680Z" },
"name": "pizza",
"description": "Frozen Cheese",
...
}
},
"txn_ts": 1711024518680000,
"stats": {
...
}
}
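Event payloads use Fauna’s tagged wire format: documents arrive under an `@doc` key, collection references under `@mod`, and timestamps under `@time`. Client drivers decode these tags for you; the following is only a minimal Python sketch of what that decoding looks like, based on the example events shown here (the `untag` helper is hypothetical, not part of any driver).

```python
import json
from datetime import datetime

def untag(value):
    """Recursively unwrap Fauna's tagged wire format (@doc, @mod, @time)."""
    if isinstance(value, dict):
        if "@doc" in value:
            return untag(value["@doc"])
        if "@mod" in value:
            return value["@mod"]  # collection name as a plain string
        if "@time" in value:
            # Normalize the trailing "Z" so fromisoformat() accepts it
            return datetime.fromisoformat(value["@time"].replace("Z", "+00:00"))
        return {k: untag(v) for k, v in value.items()}
    if isinstance(value, list):
        return [untag(v) for v in value]
    return value

event = json.loads("""
{
  "type": "update",
  "data": {
    "@doc": {
      "id": "392914348360597540",
      "coll": { "@mod": "Product" },
      "ts": { "@time": "2024-03-21T12:35:18.680Z" },
      "name": "pizza",
      "description": "Frozen Cheese"
    }
  },
  "txn_ts": 1711024518680000
}
""")

doc = untag(event["data"])
```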
Supported sources
You can only create a stream on a Set from a supported source. The source affects the exact behavior of toStream() or changesOn().

Supported source | Behavior
---|---
Collection | Tracks changes to documents in the collection.
Index | Tracks changes to the index’s terms and values fields.
Document | Tracks changes to a single document.
Collection streams
Calling toStream() directly on all() tracks any change to any document in the collection. The following query tracks any change to documents in the Product collection.
Product.all().toStream()
For example, if you change a Product document’s price to below 100.00, Fauna sends an update event to the stream.
You can use where() to filter the tracked documents for a collection. For example, the following query only tracks Product documents with a price of less than 100.00.
Product.where(.price < 100.00).toStream()
If you change a Product document’s price from above 100.00 to below 100.00, Fauna sends an add event to the stream. Before the change, the document wasn’t part of the stream’s Set.
You can use changesOn() to only track changes to specific fields. The following query tracks changes made to any Product document’s description field. The stream doesn’t send events for changes to other fields.
Product.all().changesOn(.description)
Index streams
Index streams only send events for changes to the index’s terms or values fields. For example, the following Product collection’s byCategory index has:

- A term field of category
- Value fields of name and price
collection Product {
index byCategory {
terms [.category]
values [.name, .price]
}
...
}
The following query only tracks changes to the category, name, or price fields for Product documents with a category of sports.
Product.byCategory('sports').toStream()
When called on an index, changesOn() only accepts the index’s terms or values fields as arguments. For example, in the following query, changesOn() only accepts .category, .name, or .price as arguments.
Product.byCategory('sports').changesOn(.category, .name)
Document streams
You can use streams to track changes to a Set containing a single document. These streams only receive events when the document changes. Use Set.single() to create a Set from a Document.
let product = Product.byId(392174614626697728)!
Set.single(product).toStream()
You can use changesOn() to only track changes to specific fields of the document.
let product = Product.byId(392174614626697728)!
Set.single(product).changesOn(.name, .price)
Supported transformations and filters
Streams only support source Sets that are transformed or filtered using where(), map(), or projection. This ensures Fauna can convert the Set to a stream. Sets using unsupported transformations or filters, such as drop() in the following query, fail to convert.
Product.all().drop(10).toStream()
Running the query returns the following error:
error: can't call `.toStream()` because the source set shape can't be converted to a stream
at *query*:1:32
|
1 | Product.all().drop(10).toStream()
| ^^
|
Filters
For example, the following query only tracks changes to Product documents with:

- A category of sports
- A price less than 100.00
Product
.all()
.where(.category == 'sports')
.where(.price < 100.00)
.toStream()
You can also call where() directly on toStream() or changesOn(). The following query is equivalent to the previous one.
Product
.all()
.toStream()
.where(.category == 'sports')
.where(.price < 100)
where() produces a new Set based on its criteria. The criteria affect the event types sent for changes:

- Creating a document in the Set produces an add event.
- Updating a document so that it moves into the Set produces an add event.
- Updating a document so that it remains in the Set produces an update event.
- Updating a document so that it moves out of the Set produces a remove event.
- Deleting a document from the Set produces a remove event.
- Any other changes produce no events.
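The rules above can be sketched as a small decision function. This is an illustration of the event semantics only, not driver code; the function and argument names are hypothetical.

```python
def event_type(change, was_in_set, is_in_set):
    """Return the stream event type ('add', 'update', 'remove', or None)
    for a document change, given the filtered Set membership before and after."""
    if change == "create":
        return "add" if is_in_set else None
    if change == "delete":
        return "remove" if was_in_set else None
    if change == "update":
        if not was_in_set and is_in_set:
            return "add"       # moved into the Set
        if was_in_set and is_in_set:
            return "update"    # stayed in the Set
        if was_in_set and not is_in_set:
            return "remove"    # moved out of the Set
    return None                # no event sent

# A price drop moves a Product into a `.price < 100.00` Set:
assert event_type("update", was_in_set=False, is_in_set=True) == "add"
```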
While filters affect events sent for a stream, they don’t affect event processing, which impacts performance and cost. See How filters affect costs and performance.
Projection
A stream’s add and update event types include a data field. This field contains the document that triggered the event. You can use map() or projection to return only specific document fields in these events.
For example, the following query tracks changes to any field in any Product document. The query uses map() to only include the name and price document fields in the data field of add and update events.
Product
.all()
.map(product => {
name: product.name,
price: product.price
})
.toStream()
The following query uses projection and is equivalent to the previous one.
let products = Product.all() { name, price }
products.toStream()
The previous queries can produce the following add event. The event’s data field includes only the name and price document fields.
{
"type": "add",
"data": { "name": "pizza", "price": "15.99" },
"txn_ts": 1711028312060000,
"stats": {
"read_ops": 1,
"storage_bytes_read": 69,
"compute_ops": 1,
"processing_time_ms": 0,
"rate_limits_hit": []
}
}
Events
Streams send one event per document per transaction.
Event order
Events are ordered by ascending txn_ts (transaction timestamp). Events from the same transaction share the same txn_ts, but their order may differ across clients.
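Because events from one transaction share a txn_ts, a client that needs deterministic processing can stably sort buffered events by txn_ts, preserving arrival order as the tiebreaker. A minimal, hypothetical client-side sketch:

```python
# Hypothetical client-side buffer of received events. Python's sort is
# stable, so events with the same txn_ts keep their arrival order.
events = [
    {"type": "update", "txn_ts": 1711024518680000, "doc_id": "b"},
    {"type": "add",    "txn_ts": 1711024500000000, "doc_id": "a"},
    {"type": "remove", "txn_ts": 1711024518680000, "doc_id": "c"},
]
events.sort(key=lambda e: e["txn_ts"])

# Events now run in ascending txn_ts order: a, then b and c (same txn_ts).
ordered_ids = [e["doc_id"] for e in events]
```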
Event types
The following table outlines supported event types.

Event type | Sent when …
---|---
status | A stream starts or reconnects. It’s also sent periodically.
add | A document is added to the Set.
remove | A document is removed from the Set.
update | A document in the Set changes.
Event schema
Events have the following schema.
{
"type": "add",
"txn_ts": 1710968002310000,
"data": {
"@doc": {
"id": "392914348360597540",
"coll": { "@mod": "Product" },
"ts": { "@time": "2024-03-20T21:46:12.580Z" },
"foo": "bar"
}
},
"stats": {
"read_ops": 8,
"storage_bytes_read": 208,
"compute_ops": 1,
"processing_time_ms": 0,
"rate_limits_hit": []
}
}
Field name | Type | Description
---|---|---
type | String | Event type: status, add, remove, or update.
txn_ts | Int | The related transaction’s commit time in microseconds since the Unix epoch.
data | Object | Document that triggered the event. The status event type doesn’t include this field.
stats | Object | Event statistics. See stats fields.
stats fields

Field name | Type | Description
---|---|---
read_ops | Int | Transactional Read Operations consumed by the event.
storage_bytes_read | Int | Amount of data read from storage, in bytes.
compute_ops | Int | Transactional Compute Operations consumed by the event.
processing_time_ms | Int | Event processing time in milliseconds.
rate_limits_hit | Array | Operations that exceeded their rate limit. See Global limits.
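Because each event reports its own consumption, a client can tally operations across a batch of events, for example to monitor cost. A minimal sketch; the `tally_stats` helper is hypothetical, and the field names come from the event examples above.

```python
from collections import Counter

def tally_stats(events):
    """Sum the numeric stats fields across a batch of stream events."""
    totals = Counter()
    for event in events:
        for field, value in event.get("stats", {}).items():
            if isinstance(value, (int, float)):  # skip rate_limits_hit, etc.
                totals[field] += value
    return dict(totals)

batch = [
    {"type": "status", "stats": {"read_ops": 8, "compute_ops": 1}},
    {"type": "add",    "stats": {"read_ops": 1, "compute_ops": 1}},
]
totals = tally_stats(batch)
```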
How stream subscriptions work
A client subscribes to a stream in two steps:

1. Create a stream by calling toStream() or changesOn() on a supported Set in an FQL query.

   Product.all().toStream()

   The query creates and returns a stream token.

   "g9WD1YPG..."

2. Start and subscribe to the stream by submitting a POST request to Fauna’s /stream/1 HTTP API endpoint with the stream token.

   curl https://db.fauna.com/stream/1 \
     -H 'Authorization: Bearer <accessToken>' \
     -d '{ "token": "<streamToken>" }'

   When the stream starts, Fauna responds with a status event.

   {
     "type": "status",
     "txn_ts": 1710968002310000,
     "stats": {
       "read_ops": 8,
       "storage_bytes_read": 208,
       "compute_ops": 1,
       "processing_time_ms": 0,
       "rate_limits_hit": []
     }
   }
The connection remains open so Fauna can notify the client of new events.
Fauna’s client drivers perform the above steps in a single API call.
If a change occurs between the creation of the stream token and the start of a stream, Fauna replays and sends the related events when the stream starts.
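On the open connection, events arrive as a stream of JSON values, one per line. A hedged Python sketch of the parsing side only; the `parse_events` helper and the sample lines are illustrative stand-ins for reading the open /stream/1 connection, not driver code.

```python
import json

def parse_events(lines):
    """Yield decoded events from a stream of JSON lines, skipping blanks."""
    for line in lines:
        line = line.strip()
        if line:
            yield json.loads(line)

# Stand-in for lines read from the open /stream/1 connection:
raw = [
    '{"type": "status", "txn_ts": 1710968002310000}',
    '',
    '{"type": "add", "txn_ts": 1711028312060000, "data": {"name": "pizza"}}',
]
received = list(parse_events(raw))
```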
Stream disconnection
Fauna’s client drivers can detect connection loss and automatically reconnect disconnected streams. Events for changes that occur during network issues are replayed and sent when the stream reconnects.
When a stream reconnects, Fauna sends a new status event.
{
"type": "status",
"txn_ts": 1710968002310000,
"stats": {
"read_ops": 8,
"storage_bytes_read": 208,
"compute_ops": 1,
"processing_time_ms": 0,
"rate_limits_hit": []
}
}
To support reconnects, the /stream/1 HTTP API endpoint accepts an optional start_ts request body parameter. start_ts is an Int representing the stream start time in microseconds since the Unix epoch.
curl https://db.fauna.com/stream/1 \
-H 'Authorization: Bearer <accessToken>' \
-d '
{
"token": "<streamToken>",
"start_ts": 1710968002310000
}'
start_ts must be later than the creation time of the stream token. The period between the stream restart and start_ts can’t exceed the history_days value for the source Set’s collection. If a collection’s history_days is 0 or unset, the period can’t exceed 15 minutes.
Costs and performance
A stream’s cost and performance are closely related to its shape. A stream’s shape is defined by:

- The source Set
- Transformations and filters applied to the source Set
Processing and sending events for streams consume Transactional Read and Transactional Compute Operations. The exact number of operations consumed varies based on the stream’s shape.
Depending on its cardinality and throughput, subscribing to a stream for a large Set may cause delays in event delivery and consume more operations.
If a stream replays events after a reconnect, it may also consume additional operations.
Each stream event includes stats fields that report consumed operations. If you exceed your account’s or your plan’s operations limit, Fauna closes the stream with an error.
How filters affect costs and performance
Streams may discard events based on filters.
For example, a stream with the following query uses a filter to only send events for Product documents with a category of sports.
Product
.all()
.where(.category == 'sports')
.toStream()
To do this, Fauna processes an event for any change to any Product document. It then discards events for documents without a category of sports. These discarded events still consume operations for your account.

To track changes for a stream on a large Set, it’s recommended to use an index stream. For example, the following stream produces events similar to the previous one. However, it only tracks covered term and value fields, resulting in potentially higher performance and lower costs.
Product
.byCategory('sports')
.toStream()
Another source of discarded events is privilege predicates in roles. For example, the following role uses predicates to grant its members read and write access only to Product documents with a category of sports.
role SportsManager {
privileges Product {
write {
predicate ((product, _) => product.category == "sports")
}
read {
predicate (product => product.category == "sports")
}
}
}
A stream using an access key with this role is only sent events for documents the role can access. Other events are discarded. These discarded events still consume operations for your account.
Limitations
- Operation limits apply to streams.
- While processing events, Fauna runs one query per transaction.
- Streams are limited to 128 transactions at a time. If a stream has more than 128 transactions to process, Fauna closes the stream.