
RTA use cases #3835


Merged
23 commits
54d7f2b
chore: more.
billy-the-fish Feb 10, 2025
de5f519
chore: bit more stuff
billy-the-fish Feb 10, 2025
841a3d3
Merge branch 'release-2.18.0-main' of github.com:timescale/docs into …
billy-the-fish Feb 10, 2025
1eb91d7
chore: bit more stuff
billy-the-fish Feb 10, 2025
e60ce73
chore: bit more stuff
billy-the-fish Feb 11, 2025
267aacb
chore: bit more stuff
billy-the-fish Feb 12, 2025
9cc60ea
chore: bit more stuff
billy-the-fish Feb 12, 2025
e467c6e
chore: remove collapsible sections and align with prerequisites.
billy-the-fish Feb 13, 2025
f28507a
fix: update vale vocab file for 3.x structure.
billy-the-fish Feb 13, 2025
802eb83
Apply suggestions from code review
billy-the-fish Feb 13, 2025
bf563f9
Update tutorials/index.md
billy-the-fish Feb 13, 2025
7fbec32
fix: add vocab for old version of vale.
billy-the-fish Feb 13, 2025
849aa73
fix: on review.
billy-the-fish Feb 13, 2025
e7013aa
Merge branch 'release-2.18.0-main' of github.com:timescale/docs into …
billy-the-fish Feb 14, 2025
30a9e2f
All docs language updates
atovpeko Feb 17, 2025
92f4429
Merge branch 'release-2.18.0-main' into 239-docs-rfc-create-a-use-cas…
billy-the-fish Feb 20, 2025
160106e
chore: newline
billy-the-fish Feb 20, 2025
5236f60
chore: add hypercore section.
billy-the-fish Feb 25, 2025
3ff21c2
Merge branch 'release-2.18.0-main' into 239-docs-rfc-create-a-use-cas…
billy-the-fish Feb 25, 2025
2e48be6
chore: more updates.
billy-the-fish Mar 6, 2025
4195d9d
chore: remove repeated includes.
billy-the-fish Mar 7, 2025
a9d824b
chore: move to crypto dataset.
billy-the-fish Mar 10, 2025
5debaf4
chore: update on review.
billy-the-fish Mar 10, 2025
2 changes: 2 additions & 0 deletions .github/styles/Vocab/Timescale/accept.txt
@@ -2,6 +2,7 @@
Aiven
Alertmanager
API
api
Anthropic
async
[Aa]utoscal(?:e|ing)
@@ -16,6 +17,7 @@ https?
BSD
[Cc]allouts?
COPY
[Cc]opy
[Cc]lickstreams?
Cloudformation
Cloudwatch
159 changes: 159 additions & 0 deletions .github/styles/config/vocabularies/Timescale/accept.txt
@@ -0,0 +1,159 @@
[Aa]ccessor
Aiven
Alertmanager
API
api
Anthropic
async
[Aa]utoscal(?:e|ing)
[Bb]ackfill(?:ing)?
cron
csv
https?
(?i)bigint
[Bb]itemporal
[Bb]lockchains?
[Bb]oolean
BSD
[Cc]allouts?
COPY
[Cc]opy
[Cc]lickstreams?
Cloudformation
Cloudwatch
config
[Cc]onstif(?:y|ied)
[Cc]rypto
[Cc]ryptocurrenc(?:y|ies)
Datadog
[Dd]efragment(:?s|ed)?
Django
distincts
Docker
[Dd]ownsampl(?:e|ing)
erroring
Ethereum
[Ff]ailover
[Ff]inalizers?
[Ff]orex
[Gg]apfill(?:ing)?
[Gg]eospatial
GitHub
GNU
Grafana
GUC
gsed
gzip(?:ped)?
Hasura
HipChat
[Hh]ostname
href
[Hh]yperloglog
[Hh]yperfunctions?
[Hh]ypertables?
[Hh]ypershift
img
Indri
[Ii]nserter
[Ii]ntraday
[Iin]validation
ivfflat
jpg
JDBC
JDK
JSON
Kafka
Kaggle
Kinesis
[Ll]ambda
LangChain
LlamaIndex
LLMs
[Ll]ookups?
loopback
[Mm]atchers?
[Mm]aterializer
(?i)MST
matplotlib
[Mm]utators?
Nagios
[Nn]amespaces?
[Nn]ullable
Outflux
[Pp]ageviews?
[Pp]aralleliza(?:ble|tion)
[Pp]athname
Patroni
Paulo
[Pp]erformant
pg_dump
pg_restore
[Pp]gvector
[Pp]laintext
Plotly
pre
POSIX
PostgreSQL
[Pp]ooler?
Prometheus
PromLens
PromQL
Promscale
Protobuf
psql
[Qq]uantiles?
qStudio
RDS
[Rr]edistribut(?:e|able)
[Rr]eindex(?:ed)?
reltuples
[Rr]eusability
[Rr]unbooks?
[Ss]crollable
Sequelize
[Ss]ignups?
[Ss]iloed
(?i)smallint
sed
src
[Ss]ubquer(?:y|ies)
[Ss]ubsets?
[Ss]upersets?
[Tt]ablespaces?
Telegraf
Thanos
Threema
[Tt]iering
[Tt]imevectors?
Timescale(?:DB)?
tobs
[Tt]ransactionally
tsdbadmin
Uber
[Uu]nary
[Uu]ncomment
[Uu]nencrypted
Unix
[Uu]nmaterialized
[Uu]nregister
[Uu]nthrottled?
[Uu]ntier
[Uu]pserts?
[Rr]ebalanc(?:e|ing)
[Rr]epos?
[Rr]ollups?
[Ss]erverless
[Ss]hard(?:s|ing)?
SkipScan
(?i)timestamptz
URLs?
URIs?
UUID
versionContent
[Vv]irtualenv
WAL
[Ww]ebsockets?
Worldmap
www
Zabbix
Zipkin
11 changes: 6 additions & 5 deletions _partials/_add-data-blockchain.md
@@ -1,16 +1,14 @@
## Load financial data

This tutorial uses Bitcoin transactions from the past five days.

## Ingest the dataset
The dataset contains around 1.5 million Bitcoin transactions spanning five days. It includes
information about each transaction, along with its value in [satoshi][satoshi-def]. It also states whether a
trade is a [coinbase][coinbase-def] transaction, and the reward the coin miner received for mining the coin.

To ingest data into the tables that you created, you need to download the
dataset and copy the data to your database.

<Procedure>

### Ingesting the dataset

1. Download the `bitcoin_sample.zip` file. The file contains a `.csv`
file that contains Bitcoin transactions for the past five days. Download:

@@ -37,3 +35,6 @@ dataset and copy the data to your database.
resources.

</Procedure>

[satoshi-def]: https://www.pcmag.com/encyclopedia/term/satoshi
[coinbase-def]: https://www.pcmag.com/encyclopedia/term/coinbase-transaction
2 changes: 0 additions & 2 deletions _partials/_add-data-energy.md
@@ -5,8 +5,6 @@ into the `metrics` hypertable.

<Procedure>

### Loading energy consumption data

<Highlight type="important">
This is a large dataset, so it might take a long time, depending on your network
connection.
2 changes: 0 additions & 2 deletions _partials/_add-data-nyctaxis.md
@@ -5,8 +5,6 @@ When you have your database set up, you can load the taxi trip data into the

<Procedure>

### Loading trip data

<Highlight type="important">
This is a large dataset, so it might take a long time, depending on your network
connection.
35 changes: 19 additions & 16 deletions _partials/_add-data-twelvedata-crypto.md
@@ -1,29 +1,29 @@
## Load financial data

This tutorial uses real-time cryptocurrency data, also known as tick data, from
[Twelve Data][twelve-data]. A direct download link is provided below.
[Twelve Data][twelve-data]. To ingest data into the tables that you created, you need to
download the dataset, then upload the data to your $SERVICE_LONG.

### Ingest the dataset
<Procedure>

To ingest data into the tables that you created, you need to download the
dataset and copy the data to your database.

<Procedure>
1. Unzip <Tag type="download">[crypto_sample.zip](https://assets.timescale.com/docs/downloads/candlestick/crypto_sample.zip)</Tag> to a `<local folder>`.

#### Ingesting the dataset
This test dataset contains second-by-second trade data for the most-traded crypto-assets
and a regular table of asset symbols and company names.

1. Download the `crypto_sample.zip` file. The file contains two `.csv`
files; one with company information, and one with real-time stock trades for
the past month. Download:
<Tag
type="download">[crypto_sample.zip](https://assets.timescale.com/docs/downloads/candlestick/crypto_sample.zip)
</Tag>
To import up to 100GB of data directly from your current PostgreSQL based database,
[migrate with downtime][migrate-with-downtime] using native PostgreSQL tooling. To seamlessly import 100GB-10TB+
of data, use the [live migration][migrate-live] tooling supplied by $COMPANY. To add data from non-PostgreSQL
data sources, see [Import and ingest data][data-ingest].

1. In a new terminal window, run this command to unzip the `.csv` files:

```bash
unzip crypto_sample.zip
```

1. In Terminal, navigate to `<local folder>` and connect to your $SERVICE_SHORT.
```bash
psql -d "postgres://<username>:<password>@<host>:<port>/<database-name>"
```
The connection information for a $SERVICE_SHORT is available in the file you downloaded when you created it.

1. At the `psql` prompt, use the `COPY` command to transfer data into your
Timescale instance. If the `.csv` files aren't in your current directory,
@@ -44,3 +44,6 @@ dataset and copy the data to your database.
</Procedure>

[twelve-data]: https://twelvedata.com/
[migrate-with-downtime]: /migrate/:currentVersion:/pg-dump-and-restore/
[migrate-live]: /migrate/:currentVersion:/live-migration/
[data-ingest]: /use-timescale/:currentVersion:/ingest-data/
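
The `COPY` step in the procedure above is truncated in this diff view. As a rough sketch only: at the `psql` prompt, the client-side `\copy` meta-command loads a local CSV into an existing table. The table and file names below (`crypto_ticks`, `crypto_assets`, `tutorial_sample_*.csv`) are assumptions for illustration, not names confirmed by this diff:

```sql
-- Hypothetical example: load the unzipped CSV files into the tutorial tables.
-- Run from the directory that contains the .csv files, or use absolute paths.
\copy crypto_ticks FROM 'tutorial_sample_tick.csv' CSV HEADER;
\copy crypto_assets FROM 'tutorial_sample_assets.csv' CSV HEADER;
```

`\copy` reads the file on the client machine, which is why the procedure has you navigate to the folder containing the unzipped files before connecting.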
4 changes: 0 additions & 4 deletions _partials/_add-data-twelvedata-stocks.md
@@ -3,15 +3,11 @@
This tutorial uses real-time stock trade data, also known as tick data, from
[Twelve Data][twelve-data]. A direct download link is provided below.

## Ingest the dataset

To ingest data into the tables that you created, you need to download the
dataset and copy the data to your database.

<Procedure>

#### Ingesting the dataset

1. Download the `real_time_stock_data.zip` file. The file contains two `.csv`
files; one with company information, and one with real-time stock trades for
the past month. Download:
18 changes: 9 additions & 9 deletions _partials/_caggs-intro.md
@@ -1,12 +1,12 @@
Time-series data usually grows very quickly. And that means that aggregating the
data into useful summaries can become very slow. Continuous aggregates makes
aggregating data lightning fast.
In modern applications, data usually grows very quickly. This means that aggregating
it into useful summaries can become very slow. $CLOUD_LONG continuous aggregates make
aggregating data lightning fast, accurate, and easy.

If you are collecting data very frequently, you might want to aggregate your
data into minutes or hours instead. For example, if you have a table of
temperature readings taken every second, you can find the average temperature
data into minutes or hours instead. For example, if an IoT device takes
temperature readings every second, you might want to find the average temperature
for each hour. Every time you run this query, the database needs to scan the
entire table and recalculate the average every time.
entire table and recalculate the average.

Continuous aggregates are a kind of hypertable that is refreshed automatically
in the background as new data is added, or old data is modified. Changes to your
@@ -21,10 +21,10 @@ means that you can get on with working your data instead of maintaining your
database.

Because continuous aggregates are based on hypertables, you can query them in
exactly the same way as your other tables, and enable [Hypercore][hypercore]
or [tiered storage][data-tiering] on your continuous aggregates. You can even
exactly the same way as your other tables, and enable [compression][compression]
or [tiered storage][data-tiering] on them. You can even
create
[continuous aggregates on top of your continuous aggregates][hierarchical-caggs].
[continuous aggregates on top of your continuous aggregates][hierarchical-caggs] for even more fine-grained aggregation.

By default, querying continuous aggregates provides you with real-time data.
Pre-aggregated data from the materialized view is combined with recent data that
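
The hourly-average scenario described in this partial can be sketched as a continuous aggregate. The table and column names below (`conditions`, `time`, `device`, `temperature`) are illustrative assumptions, not the tutorial's actual schema:

```sql
-- Minimal continuous-aggregate sketch, assuming a hypertable
-- conditions(time timestamptz, device int, temperature float).
CREATE MATERIALIZED VIEW conditions_hourly
WITH (timescaledb.continuous) AS
SELECT time_bucket('1 hour', time) AS bucket,
       device,
       avg(temperature) AS avg_temp
FROM conditions
GROUP BY bucket, device;
```

Because the view is refreshed incrementally in the background, querying `conditions_hourly` avoids rescanning the whole table on every request.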
4 changes: 1 addition & 3 deletions _partials/_cloud-create-connect-tutorials.md
@@ -6,9 +6,7 @@ command-line utility. If you've used PostgreSQL before, you might already have

<Procedure>

### Create a Timescale service and connect to the service

1. In the [Timescale portal][timescale-portal], click `Create service`.
1. In the [$CONSOLE][timescale-portal], click `Create service`.
1. Click `Download the cheatsheet` to download an SQL file that contains the
login details for your new service. You can also copy the details directly
from this page. When you have copied your password,
4 changes: 1 addition & 3 deletions _partials/_create-hypertable-blockchain.md
@@ -1,5 +1,5 @@

## Create a hypertable
## Optimize time-series data in hypertables

Hypertables are the core of Timescale. Hypertables enable Timescale to work
efficiently with time-series data. Because Timescale is PostgreSQL, all the
@@ -9,8 +9,6 @@ with Timescale tables similar to standard PostgreSQL.

<Procedure>

### Creating a hypertable

1. Create a standard PostgreSQL table to store the Bitcoin blockchain data
using `CREATE TABLE`:

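
The `CREATE TABLE` step referenced in the procedure above is truncated here. As a sketch only, with a simplified, assumed schema (the tutorial's real blockchain columns may differ):

```sql
-- Sketch: create a standard PostgreSQL table, then convert it into a
-- hypertable partitioned on the time column. Column names are assumptions.
CREATE TABLE transactions (
   time       TIMESTAMPTZ NOT NULL,
   block_id   INT,
   hash       TEXT,
   fee        BIGINT
);

SELECT create_hypertable('transactions', 'time');
```

After conversion, the table is queried exactly like a standard PostgreSQL table, which is the point the partial makes about Timescale being PostgreSQL.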
4 changes: 1 addition & 3 deletions _partials/_create-hypertable-energy.md
@@ -1,4 +1,4 @@
## Create a hypertable
## Optimize time-series data in hypertables

Hypertables are the core of Timescale. Hypertables enable Timescale to work
efficiently with time-series data. Because Timescale is PostgreSQL, all the
@@ -8,8 +8,6 @@ with Timescale tables similar to standard PostgreSQL.

<Procedure>

### Creating a hypertable

1. Create a standard PostgreSQL table to store the energy consumption data
using `CREATE TABLE`:
