perf(api): Implement batch operations for bulk actions #204

Closed
opened 2026-01-24 17:14:52 +00:00 by jack · 0 comments
Owner

## Problem

Bulk operations are currently executed one at a time:

```typescript
// DataRouter.bulkDeleteObservations()
for (const id of idList) {
  const deleted = await this.deps.observations.delete(id);  // N DELETE queries
}
```

**1000 observations = 1000 separate DELETE queries!**

## Solution

### 1. Batch Delete

```typescript
// ObservationRepository
async batchDelete(ids: number[]): Promise<number> {
  if (ids.length === 0) return 0;

  // Chunk large arrays (SQLite limits the number of parameters in an IN clause)
  const chunks = chunk(ids, 500);
  let totalDeleted = 0;

  // Note: loop variable renamed to avoid shadowing the chunk() helper
  for (const batch of chunks) {
    const result = await this.em.nativeDelete(Observation, {
      id: { $in: batch }
    });
    totalDeleted += result;
  }

  return totalDeleted;
}
```
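The `chunk` helper is referenced but not defined in the issue; a minimal sketch (the name and signature are assumptions — lodash's `chunk` has the same shape and would also work):

```typescript
// Hypothetical chunk helper: splits an array into subarrays
// of at most `size` elements, preserving order.
function chunk<T>(items: T[], size: number): T[][] {
  const result: T[][] = [];
  for (let i = 0; i < items.length; i += size) {
    result.push(items.slice(i, i + size));
  }
  return result;
}
```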

### 2. Batch Update

```typescript
// TaskRepository
async batchUpdateStatus(ids: string[], status: TaskStatus): Promise<number> {
  return this.em.nativeUpdate(Task,
    { id: { $in: ids } },
    { status, completedAt: Date.now() }
  );
}
```

### 3. Batch Insert

```typescript
// ObservationRepository
async batchCreate(observations: CreateObservationInput[]): Promise<Observation[]> {
  const entities = observations.map(o => this.em.create(Observation, o));
  await this.em.persistAndFlush(entities);
  return entities;
}
```

### 4. Refactored API Endpoints

```typescript
// DELETE /api/data/observations/bulk
router.delete('/observations/bulk', async (req, res) => {
  const { ids } = req.body;

  // Validation
  if (!Array.isArray(ids) || ids.length > 10000) {
    return res.status(400).json({ error: 'Invalid or too many IDs' });
  }

  const deleted = await observations.batchDelete(ids);

  // Invalidate cached observation data
  cache.invalidate('observations:');

  res.json({ deleted });
});
```
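The inline validation above could be factored into a small pure helper so the PATCH and POST bulk routes can reuse the same checks (the helper name and constant are illustrative, not from the issue):

```typescript
// Hypothetical validation helper; 10000 mirrors the limit in the route above.
const MAX_BULK_IDS = 10000;

// Returns null when valid, otherwise a human-readable error message.
function validateBulkIds(ids: unknown): string | null {
  if (!Array.isArray(ids)) return 'ids must be an array';
  if (ids.length > MAX_BULK_IDS) return `too many IDs (max ${MAX_BULK_IDS})`;
  return null;
}
```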

## Impact

| Operation        | Before             | After             |
|------------------|--------------------|-------------------|
| Delete 1000 obs  | 1000 queries, ~10s | 2 queries, ~100ms |
| Update 500 tasks | 500 queries, ~5s   | 1 query, ~50ms    |
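The delete-side query counts follow directly from the chunk size: with 500 IDs per IN clause, 1000 deletes collapse into ceil(1000 / 500) = 2 statements. As a small sanity check:

```typescript
// Number of DELETE statements needed for n IDs at a given chunk size.
function queryCount(n: number, chunkSize: number): number {
  return Math.ceil(n / chunkSize);
}
```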

## New Endpoints

```
DELETE /api/data/observations/bulk
Body: { ids: number[] }

PATCH /api/data/tasks/bulk
Body: { ids: string[], status: string }

POST /api/data/observations/bulk
Body: { observations: CreateObservationInput[] }
```

## Acceptance Criteria

- [ ] Batch methods in all relevant repositories
- [ ] Chunking for large arrays (SQLite limits)
- [ ] New bulk API endpoints
- [ ] Transactions for atomicity
- [ ] Validation of array size
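The atomicity criterion is not covered by the snippets above: a failure in a later chunk would leave earlier chunks deleted. A sketch of wrapping the chunked delete in a transaction, modeled on MikroORM's `em.transactional` / `nativeDelete` (the `Em` interface below is a minimal hypothetical stand-in for the real EntityManager, used here so the logic is self-contained):

```typescript
// Minimal stand-in for the MikroORM EntityManager calls used (assumption).
interface Em {
  transactional<T>(fn: (em: Em) => Promise<T>): Promise<T>;
  nativeDelete(entity: string, where: object): Promise<number>;
}

// Sketch: run all chunks inside one transaction, so a failure in any
// chunk rolls back the whole bulk delete instead of leaving a partial result.
async function batchDeleteAtomic(em: Em, entity: string, ids: number[]): Promise<number> {
  return em.transactional(async (tx) => {
    let total = 0;
    for (let i = 0; i < ids.length; i += 500) {
      total += await tx.nativeDelete(entity, { id: { $in: ids.slice(i, i + 500) } });
    }
    return total;
  });
}
```

With the real EntityManager, any chunk that throws causes `transactional` to roll back the surrounding transaction before rethrowing.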
jack closed this issue 2026-01-25 00:00:08 +00:00