batch¶
You can use batch clauses with queries which return multiple rows, such as select.
Example¶
By default, a query returns as many rows as you ask for. This becomes a problem when a table contains millions of rows, because you might not want to load them all into memory at once. To get around this, you can batch the responses.
# Returns 100 rows at a time:
async with await Manager.select().batch(batch_size=100) as batch:
    async for _batch in batch:
        print(_batch)
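Each _batch is the next chunk of results rather than a single row, so you typically iterate over it again to process rows one at a time. A minimal sketch, assuming the results arrive as dictionaries (as with a normal select query) and that Manager has a name column; the column name is illustrative:

# A sketch of per-row processing. Assumes Manager has a "name" column,
# and that each _batch is a list of row dictionaries, as with select().
async with await Manager.select().batch(batch_size=100) as batch:
    async for _batch in batch:
        for row in _batch:
            print(row["name"])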
Node¶
If you’re using extra_nodes with PostgresEngine, you can specify which node to query:
# Returns 100 rows at a time from read_replica_db
async with await Manager.select().batch(
    batch_size=100,
    node="read_replica_db",
) as batch:
    async for _batch in batch:
        print(_batch)
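The node name must match a key in the engine’s extra_nodes mapping. For context, here is a minimal sketch of how that might be configured; the database name and replica host are illustrative, not part of the original example:

from piccolo.engine.postgres import PostgresEngine

# Hypothetical engine configuration: "read_replica_db" points at a
# read replica, while the default config is used for everything else.
DB = PostgresEngine(
    config={"database": "my_app"},
    extra_nodes={
        "read_replica_db": PostgresEngine(
            config={
                "database": "my_app",
                "host": "read-replica.example.com",
            }
        ),
    },
)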
Synchronous version¶
There’s currently no synchronous version. However, it’s easy enough to achieve:
from piccolo.utils.sync import run_sync


async def get_batch():
    async with await Manager.select().batch(batch_size=100) as batch:
        async for _batch in batch:
            print(_batch)


run_sync(get_batch())
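If you’d rather not use Piccolo’s helper, the standard library’s asyncio.run achieves the same thing; a sketch:

import asyncio

# Equivalent to run_sync(get_batch()), using only the standard library.
asyncio.run(get_batch())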