Tuesday, October 14, 2025

Spring Batch Sample Project

 

Spring Batch with Quartz Scheduler

A comprehensive Spring Boot application featuring Spring Batch jobs with Quartz scheduling, PostgreSQL persistence, and a REST API for job management.

Features

  • Spring Batch Jobs:
    • sampleJob - Simple demonstration job
    • dataProcessingJob - User data processing from database
    • fileToDbJob - Read CSV file and persist to database
    • dbToFileJob - Read from database and export to CSV file
  • Quartz Scheduler: Schedule jobs with cron expressions
  • REST API: Manage and execute jobs on-demand
  • PostgreSQL: Job metadata and data persistence
  • File Processing: Configurable input/output directories

Prerequisites

  • Java 17 or higher
  • Maven 3.6+
  • PostgreSQL 12+
  • Docker (optional, for PostgreSQL)

Database Setup

Option 1: Using Docker

```bash
docker run --name postgres-batch \
  -e POSTGRES_USER=postgres \
  -e POSTGRES_PASSWORD=postgres \
  -e POSTGRES_DB=batchdb \
  -p 5432:5432 \
  -d postgres:14
```

Option 2: Local PostgreSQL

Create a database named batchdb:

```sql
CREATE DATABASE batchdb;
```

Configuration

Edit src/main/resources/application.properties:

```properties
# Database Configuration
spring.datasource.url=jdbc:postgresql://localhost:5432/batchdb
spring.datasource.username=postgres
spring.datasource.password=postgres

# File Processing Paths
batch.file.input.directory=/tmp/batch/input
batch.file.output.directory=/tmp/batch/output
batch.file.input.filename=input-data.csv
batch.file.output.filename=output-data.csv
```
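
The Batch and Quartz metadata tables referenced later are created through Spring Boot's schema-initialization switches. If they are missing on startup, properties along these lines can be added (the property names are standard Spring Boot ones; using `always` here is an assumption suited to a sample project, not a production recommendation):

```properties
# Create the Spring Batch metadata tables (BATCH_*) on startup
spring.batch.jdbc.initialize-schema=always

# Persist Quartz triggers in PostgreSQL and create the QRTZ_* tables
spring.quartz.job-store-type=jdbc
spring.quartz.jdbc.initialize-schema=always
```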

Set Up File Processing

  1. Create the input and output directories:

     ```bash
     mkdir -p /tmp/batch/input
     mkdir -p /tmp/batch/output
     ```

  2. Copy the sample input file:

     ```bash
     cp sample-input-data.csv /tmp/batch/input/input-data.csv
     ```

Build and Run

```bash
# Build the project
mvn clean install

# Run the application
mvn spring-boot:run
```

The application will start on http://localhost:8080

REST API Endpoints

List All Jobs

```bash
GET http://localhost:8080/api/jobs
```

Run Job On-Demand

```bash
POST http://localhost:8080/api/jobs/{jobName}/run
```

Example:

```bash
curl -X POST http://localhost:8080/api/jobs/fileToDbJob/run
curl -X POST http://localhost:8080/api/jobs/dbToFileJob/run
```

Get Job Executions

```bash
# All executions
GET http://localhost:8080/api/jobs/executions

# Specific job executions
GET http://localhost:8080/api/jobs/{jobName}/executions

# Single execution details
GET http://localhost:8080/api/jobs/executions/{executionId}
```

Schedule Job with Cron

```bash
POST http://localhost:8080/api/jobs/{jobName}/schedule
Content-Type: application/json

{
  "cronExpression": "0 0/5 * * * ?"
}
```

Example:

```bash
# Schedule fileToDbJob to run every 5 minutes
curl -X POST http://localhost:8080/api/jobs/fileToDbJob/schedule \
  -H "Content-Type: application/json" \
  -d '{"cronExpression": "0 0/5 * * * ?"}'

# Schedule dbToFileJob to run daily at 2 AM
curl -X POST http://localhost:8080/api/jobs/dbToFileJob/schedule \
  -H "Content-Type: application/json" \
  -d '{"cronExpression": "0 0 2 * * ?"}'
```

Unschedule Job

```bash
DELETE http://localhost:8080/api/jobs/{jobName}/schedule
```

Pause/Resume Job

```bash
POST http://localhost:8080/api/jobs/{jobName}/pause
POST http://localhost:8080/api/jobs/{jobName}/resume
```

Get Next Fire Time

```bash
GET http://localhost:8080/api/jobs/{jobName}/next-fire-time
```

Common Cron Expressions

  • 0 0/5 * * * ? - Every 5 minutes
  • 0 0 * * * ? - Every hour
  • 0 0 2 * * ? - Daily at 2 AM
  • 0 0 12 * * ? - Daily at noon
  • 0 0 12 * * MON-FRI - Weekdays at noon
  • 0 0 0 1 * ? - First day of every month at midnight
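
These are Quartz cron expressions, which use six or seven space-separated fields (second, minute, hour, day-of-month, month, day-of-week, optional year) rather than the five-field Unix crontab format. A minimal shell sketch of a shape check before submitting a schedule request (it counts fields only and does not validate individual values):

```shell
# Count the fields of a Quartz cron expression; 6 or 7 is the valid shape.
cron="0 0/5 * * * ?"
set -f          # disable globbing so "*" fields are not expanded to filenames
set -- $cron    # split the expression on whitespace into positional parameters
if [ "$#" -ge 6 ] && [ "$#" -le 7 ]; then
  echo "plausible Quartz expression: $# fields"
else
  echo "unexpected field count: $#"
fi
```

A five-field Unix expression such as `*/5 * * * *` fails this check, which is a quick way to catch the most common scheduling mistake before Quartz rejects it.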

File Processing Jobs

fileToDbJob

Reads CSV files from the configured input directory and persists records to the employees table.

Input File Format (CSV):

```csv
employeeId,firstName,lastName,email,department,salary
E001,John,Doe,john.doe@example.com,IT,75000.00
```
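
A test file in this format can be generated directly from the shell; the second data row below is made-up sample data, not part of the shipped sample-input-data.csv:

```shell
# Write a minimal employees CSV into the configured input directory
mkdir -p /tmp/batch/input
cat > /tmp/batch/input/input-data.csv <<'EOF'
employeeId,firstName,lastName,email,department,salary
E001,John,Doe,john.doe@example.com,IT,75000.00
E002,Jane,Smith,jane.smith@example.com,HR,68000.00
EOF

# Sanity check: print the header and count the lines (header + 2 records)
head -n 1 /tmp/batch/input/input-data.csv
wc -l < /tmp/batch/input/input-data.csv
```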

dbToFileJob

Reads all records from the employees table and exports them to a CSV file in the output directory.

Output File: Created at {batch.file.output.directory}/{batch.file.output.filename}

Project Structure

```text
src/main/java/com/example/batchscheduler/
├── config/
│   ├── BatchConfiguration.java          # Sample batch jobs
│   └── FileBatchConfiguration.java      # File processing jobs
├── controller/
│   └── JobController.java               # REST API endpoints
├── dto/
│   ├── JobExecutionInfo.java            # Job execution DTO
│   └── ScheduleRequest.java             # Schedule request DTO
├── entity/
│   └── Employee.java                    # Employee entity
├── job/
│   └── BatchJobRunner.java              # Quartz job runner
├── model/
│   └── User.java                        # User model
├── repository/
│   └── EmployeeRepository.java          # JPA repository
├── service/
│   ├── BatchJobService.java             # Job execution service
│   └── JobSchedulerService.java         # Scheduler service
└── SpringBatchSchedulerApplication.java # Main application
```

Database Tables

The application automatically creates:

  • Spring Batch metadata tables (BATCH_*)
  • Quartz scheduler tables (QRTZ_*)
  • Employee table for file processing
  • User table (if needed)

Troubleshooting

Issue: File not found

  • Ensure input directory exists: /tmp/batch/input
  • Verify file is named correctly: input-data.csv
  • Check file permissions

Issue: Database connection error

  • Verify PostgreSQL is running
  • Check credentials in application.properties
  • Ensure database batchdb exists

Issue: Job already running

  • Spring Batch prevents concurrent execution of the same job instance
  • Wait for the current execution to complete, or launch with different job parameters (a new parameter set creates a new job instance)

Monitoring

View job execution history:

```bash
curl http://localhost:8080/api/jobs/executions | json_pp
```
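
Without json_pp, status values can also be grepped straight out of the raw response. The response below is a canned example, and the jobName/status field names are assumptions about the shape of the JobExecutionInfo DTO:

```shell
# Canned example payload; in practice:
#   response=$(curl -s http://localhost:8080/api/jobs/executions)
response='[{"jobName":"fileToDbJob","status":"COMPLETED"},{"jobName":"dbToFileJob","status":"FAILED"}]'

# List every status value to spot failed executions at a glance
echo "$response" | grep -o '"status":"[A-Z]*"'
```

Each match prints on its own line, so a failed run stands out even in a long execution history.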

Check scheduled jobs:

```bash
curl http://localhost:8080/api/jobs
```

Testing the File Processing Flow

  1. Prepare the input file:

     ```bash
     cp sample-input-data.csv /tmp/batch/input/input-data.csv
     ```

  2. Run fileToDbJob:

     ```bash
     curl -X POST http://localhost:8080/api/jobs/fileToDbJob/run
     ```

  3. Verify the data in the database:

     ```sql
     SELECT * FROM employees;
     ```

  4. Export the data back to a file:

     ```bash
     curl -X POST http://localhost:8080/api/jobs/dbToFileJob/run
     ```

  5. Check the output file:

     ```bash
     cat /tmp/batch/output/output-data.csv
     ```

License

This project is open source and available under the MIT License.