# Spring Batch with Quartz Scheduler

A comprehensive Spring Boot application featuring Spring Batch jobs with Quartz scheduling, PostgreSQL persistence, and a REST API for job management.
## Features
- Spring Batch Jobs:
  - `sampleJob` - Simple demonstration job
  - `dataProcessingJob` - User data processing from database
  - `fileToDbJob` - Read CSV file and persist to database
  - `dbToFileJob` - Read from database and export to CSV file
- Quartz Scheduler: Schedule jobs with cron expressions
- REST API: Manage and execute jobs on-demand
- PostgreSQL: Job metadata and data persistence
- File Processing: Configurable input/output directories
## Prerequisites
- Java 17 or higher
- Maven 3.6+
- PostgreSQL 12+
- Docker (optional, for PostgreSQL)
## Database Setup
### Option 1: Using Docker
```bash
docker run --name postgres-batch \
  -e POSTGRES_USER=postgres \
  -e POSTGRES_PASSWORD=postgres \
  -e POSTGRES_DB=batchdb \
  -p 5432:5432 \
  -d postgres:14
```

### Option 2: Local PostgreSQL
Create a database named `batchdb`:
```sql
CREATE DATABASE batchdb;
```

## Configuration
Edit `src/main/resources/application.properties`:
```properties
# Database Configuration
spring.datasource.url=jdbc:postgresql://localhost:5432/batchdb
spring.datasource.username=postgres
spring.datasource.password=postgres

# File Processing Paths
batch.file.input.directory=/tmp/batch/input
batch.file.output.directory=/tmp/batch/output
batch.file.input.filename=input-data.csv
batch.file.output.filename=output-data.csv
```

## Setup File Processing
1. Create the input/output directories:

   ```bash
   mkdir -p /tmp/batch/input
   mkdir -p /tmp/batch/output
   ```

2. Copy the sample input file:

   ```bash
   cp sample-input-data.csv /tmp/batch/input/input-data.csv
   ```

## Build and Run
```bash
# Build the project
mvn clean install

# Run the application
mvn spring-boot:run
```

The application will start on http://localhost:8080.
## REST API Endpoints
### List All Jobs
```bash
GET http://localhost:8080/api/jobs
```

### Run Job On-Demand
```bash
POST http://localhost:8080/api/jobs/{jobName}/run
```

Example:
```bash
curl -X POST http://localhost:8080/api/jobs/fileToDbJob/run
curl -X POST http://localhost:8080/api/jobs/dbToFileJob/run
```

### Get Job Executions
```bash
# All executions
GET http://localhost:8080/api/jobs/executions

# Specific job executions
GET http://localhost:8080/api/jobs/{jobName}/executions

# Single execution details
GET http://localhost:8080/api/jobs/executions/{executionId}
```

### Schedule Job with Cron
```bash
POST http://localhost:8080/api/jobs/{jobName}/schedule
Content-Type: application/json

{
  "cronExpression": "0 0/5 * * * ?"
}
```

Example:
```bash
# Schedule fileToDbJob to run every 5 minutes
curl -X POST http://localhost:8080/api/jobs/fileToDbJob/schedule \
  -H "Content-Type: application/json" \
  -d '{"cronExpression": "0 0/5 * * * ?"}'

# Schedule dbToFileJob to run daily at 2 AM
curl -X POST http://localhost:8080/api/jobs/dbToFileJob/schedule \
  -H "Content-Type: application/json" \
  -d '{"cronExpression": "0 0 2 * * ?"}'
```

### Unschedule Job
```bash
DELETE http://localhost:8080/api/jobs/{jobName}/schedule
```

### Pause/Resume Job
```bash
POST http://localhost:8080/api/jobs/{jobName}/pause
POST http://localhost:8080/api/jobs/{jobName}/resume
```

### Get Next Fire Time
```bash
GET http://localhost:8080/api/jobs/{jobName}/next-fire-time
```

## Common Cron Expressions

- `0 0/5 * * * ?` - Every 5 minutes
- `0 0 * * * ?` - Every hour
- `0 0 2 * * ?` - Daily at 2 AM
- `0 0 12 * * ?` - Daily at noon
- `0 0 12 ? * MON-FRI` - Weekdays at noon (Quartz requires `?` in one of the two day fields)
- `0 0 0 1 * ?` - First day of every month at midnight
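The expressions above use the Quartz cron format, which has six required fields starting with seconds (plus an optional seventh year field), unlike the five-field Unix crontab. A minimal plain-Java sketch, with no Quartz dependency, that labels each field:

```java
import java.util.ArrayList;
import java.util.List;

// Labels each field of a Quartz cron expression. Quartz uses 6 required
// fields (seconds, minutes, hours, day-of-month, month, day-of-week) plus
// an optional year, unlike the 5-field Unix crontab format.
public class CronFields {
    private static final String[] NAMES = {
        "seconds", "minutes", "hours", "day-of-month", "month", "day-of-week", "year"
    };

    static List<String> describe(String cron) {
        String[] parts = cron.trim().split("\\s+");
        if (parts.length < 6 || parts.length > 7) {
            throw new IllegalArgumentException(
                "Quartz cron needs 6 or 7 fields, got " + parts.length);
        }
        List<String> out = new ArrayList<>();
        for (int i = 0; i < parts.length; i++) {
            out.add(NAMES[i] + "=" + parts[i]);
        }
        return out;
    }

    public static void main(String[] args) {
        // "Every 5 minutes" from the table above:
        System.out.println(describe("0 0/5 * * * ?"));
        // [seconds=0, minutes=0/5, hours=*, day-of-month=*, month=*, day-of-week=?]
    }
}
```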
## File Processing Jobs
### fileToDbJob
Reads CSV files from the configured input directory and persists records to the `employees` table.
Input File Format (CSV):

```csv
employeeId,firstName,lastName,email,department,salary
E001,John,Doe,john.doe@example.com,IT,75000.00
```

### dbToFileJob
Reads all records from the `employees` table and exports them to a CSV file in the output directory.

Output File: Created at `{output.directory}/{output.filename}`
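As a rough illustration of the record shape these two jobs move between file and database, here is a plain-Java sketch that parses one line of the CSV format above. The `Employee` record below is a stand-in for the project's JPA entity, and the naive `split` does not handle quoted fields containing commas:

```java
// Parses one line of the input CSV into an employee value.
// The Employee record is a stand-in for the project's JPA entity;
// field names are taken from the CSV header.
public class CsvSketch {
    record Employee(String employeeId, String firstName, String lastName,
                    String email, String department, double salary) {}

    static Employee parseLine(String line) {
        // Naive split: assumes no quoted fields containing commas.
        String[] f = line.split(",");
        return new Employee(f[0], f[1], f[2], f[3], f[4], Double.parseDouble(f[5]));
    }

    public static void main(String[] args) {
        Employee e = parseLine("E001,John,Doe,john.doe@example.com,IT,75000.00");
        System.out.println(e.firstName() + " " + e.lastName() + " earns " + e.salary());
        // John Doe earns 75000.0
    }
}
```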
## Project Structure
```
src/main/java/com/example/batchscheduler/
├── config/
│   ├── BatchConfiguration.java          # Sample batch jobs
│   └── FileBatchConfiguration.java      # File processing jobs
├── controller/
│   └── JobController.java               # REST API endpoints
├── dto/
│   ├── JobExecutionInfo.java            # Job execution DTO
│   └── ScheduleRequest.java             # Schedule request DTO
├── entity/
│   └── Employee.java                    # Employee entity
├── job/
│   └── BatchJobRunner.java              # Quartz job runner
├── model/
│   └── User.java                        # User model
├── repository/
│   └── EmployeeRepository.java          # JPA repository
├── service/
│   ├── BatchJobService.java             # Job execution service
│   └── JobSchedulerService.java         # Scheduler service
└── SpringBatchSchedulerApplication.java # Main application
```

## Database Tables
The application automatically creates:
- Spring Batch metadata tables (`BATCH_*`)
- Quartz scheduler tables (`QRTZ_*`)
- Employee table for file processing
- User table (if needed)
## Troubleshooting
### Issue: File not found

- Ensure the input directory exists: `/tmp/batch/input`
- Verify the file is named correctly: `input-data.csv`
- Check file permissions
### Issue: Database connection error

- Verify PostgreSQL is running
- Check credentials in `application.properties`
- Ensure the database `batchdb` exists
### Issue: Job already running

- Spring Batch prevents concurrent execution of the same job instance
- Wait for the current execution to complete, or use different job parameters
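Spring Batch derives a job instance's identity from the job name plus its identifying parameters, which is why re-launching with identical parameters targets the same instance and is rejected while it runs. Adding a unique parameter, commonly a timestamp, creates a fresh instance. A toy sketch of that identity rule, with `run.time` as a purely illustrative parameter name (not part of this project's API):

```java
import java.util.Map;

// Models Spring Batch's JobInstance identity rule: instance identity is
// derived from the job name plus the identifying job parameters, so a
// unique parameter value yields a new instance on each launch.
public class JobInstanceIdentity {
    static String instanceKey(String jobName, Map<String, String> params) {
        return jobName + params.toString(); // identity = name + parameters
    }

    public static void main(String[] args) {
        // "run.time" is an illustrative unique parameter, e.g. a timestamp.
        String a = instanceKey("fileToDbJob", Map.of("run.time", "1700000000000"));
        String b = instanceKey("fileToDbJob", Map.of("run.time", "1700000000001"));
        System.out.println(a.equals(b)); // false: different parameters -> new instance
    }
}
```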
## Monitoring
View job execution history:
```bash
curl http://localhost:8080/api/jobs/executions | json_pp
```

Check scheduled jobs:
```bash
curl http://localhost:8080/api/jobs
```

## Testing the File Processing Flow
1. Prepare the input file:

   ```bash
   cp sample-input-data.csv /tmp/batch/input/input-data.csv
   ```

2. Run fileToDbJob:

   ```bash
   curl -X POST http://localhost:8080/api/jobs/fileToDbJob/run
   ```

3. Verify the data in the database:

   ```sql
   SELECT * FROM employees;
   ```

4. Export the data to a file:

   ```bash
   curl -X POST http://localhost:8080/api/jobs/dbToFileJob/run
   ```

5. Check the output file:

   ```bash
   cat /tmp/batch/output/output-data.csv
   ```

## License
This project is open source and available under the MIT License.