Importing one million rows into a database with Laravel/PHP needs to be handled carefully to avoid performance problems, memory exhaustion, or long execution times.
Here’s a complete guide to importing 1 million records into a database efficiently with Laravel:
🧠 Key Considerations Before You Begin
- 🚫 Avoid inserting all rows at once — this leads to memory exhaustion or timeouts.
- ✅ Use batch inserts (in chunks).
- ✅ Disable unnecessary features such as Eloquent events and automatic timestamps (see the sketch after this list).
- ✅ Consider queueing or job dispatching if working with large files.
- ✅ Use raw queries or the Query Builder for better performance than Eloquent.
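If part of the import still has to go through Eloquent, you can switch off model events and automatic timestamps. Here is a minimal sketch, assuming the User model from Step 1 with name and email fillable (adjust the namespace if your app still uses App\User):

use App\Models\User;

// Only relevant if you must go through Eloquent; the Query Builder path below avoids this entirely.
// Suppress model events (observers, listeners) for everything inside the callback.
User::withoutEvents(function () {
    $user = new User(['name' => 'John Doe', 'email' => 'john@example.com']);

    // Skip automatic created_at/updated_at maintenance for this instance,
    // or set the timestamps explicitly in the data you insert.
    $user->timestamps = false;

    $user->save();
});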
🧱 Scenario: You Want to Import 1 Million Rows from a CSV File
We’ll walk through importing a large .csv file of user data with two columns, name and email; the created_at and updated_at timestamps are filled in during the import.
✅ Step-by-Step Efficient Import Strategy
🔧 Step 1: Prepare Your Model and Migration
If you don’t already have a User model and users table, create them:
php artisan make:model User -m
Example migration:
Schema::create('users', function (Blueprint $table) {
    $table->id();
    $table->string('name');
    $table->string('email')->unique();
    $table->timestamps();
});
Run it:
php artisan migrate
📄 Step 2: Create a CSV Import Command
Use Artisan to create a custom CLI command:
php artisan make:command ImportUsers
⚙️ Step 3: Code the Import Logic
Open app/Console/Commands/ImportUsers.php and modify:
<?php

namespace App\Console\Commands;

use Carbon\Carbon;
use Illuminate\Console\Command;
use Illuminate\Support\Facades\DB;

class ImportUsers extends Command
{
    protected $signature = 'import:users {--file= : CSV file path}';

    protected $description = 'Import users from a CSV file';

    public function handle()
    {
        $filePath = $this->option('file');

        if (! $filePath || ! file_exists($filePath)) {
            $this->error("File not found at $filePath");

            return 1;
        }

        $this->info('Starting import...');

        $handle = fopen($filePath, 'r');

        // Skip the header row
        fgetcsv($handle);

        $batchSize = 1000;
        $batchData = [];
        $i = 0;

        while (($row = fgetcsv($handle)) !== false) {
            $i++;

            $batchData[] = [
                'name' => $row[0],
                'email' => $row[1],
                'created_at' => Carbon::now(),
                'updated_at' => Carbon::now(),
            ];

            // Flush a full batch to the database and reset the buffer
            if (count($batchData) === $batchSize) {
                DB::table('users')->insert($batchData);
                $this->output->write('.');
                $batchData = [];
            }
        }

        // Insert any remaining rows that didn't fill a full batch
        if (! empty($batchData)) {
            DB::table('users')->insert($batchData);
        }

        fclose($handle);

        $this->info("\nImport completed. Total rows processed: $i");

        return 0;
    }
}
🚀 Step 4: Prepare Your CSV File
Make sure your CSV file (users.csv) has the following structure:
name,email
John Doe,john@example.com
Jane Smith,jane@example.com
...
Use a generator script (optional) to create 1 million dummy rows.
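For example, a throwaway PHP script along these lines will do; the output path, names, and e-mail pattern are placeholders you can change:

<?php

// generate_users_csv.php — writes a CSV with 1,000,000 dummy users.
$rows = 1000000;
$handle = fopen(__DIR__ . '/users.csv', 'w');

// Header row expected by the import command
fputcsv($handle, ['name', 'email']);

for ($i = 1; $i <= $rows; $i++) {
    // Unique e-mails keep the users.email unique index happy
    fputcsv($handle, ["User $i", "user{$i}@example.com"]);
}

fclose($handle);

echo "Done: $rows rows written.\n";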
🧪 Step 5: Run the Import
From the terminal:
php artisan import:users --file=/path/to/users.csv
This will insert 1000 users per batch until all 1 million rows are imported.
📈 Performance Tips
- Use DB::table()->insert() instead of Eloquent to avoid model instantiation.
- Disable query logging if you are importing from Tinker or a web request: DB::disableQueryLog();
- Wrap the batches in a transaction if needed:
DB::beginTransaction();
try {
    // insert batches
    DB::commit();
} catch (\Exception $e) {
    DB::rollBack();
}
- Indexes speed up lookups but add overhead to every insert; where possible, add heavy secondary indexes after the import finishes.
- Use Laravel Horizon or queued jobs to run the import in the background if needed (see the job sketch after this list).
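One way to do that (a sketch, not a drop-in solution) is to dispatch one queued job per chunk; the InsertUsersChunk class below is a hypothetical name used for illustration:

<?php

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;
use Illuminate\Support\Facades\DB;

// Hypothetical job that inserts one pre-built chunk of user rows.
class InsertUsersChunk implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    public array $rows;

    public function __construct(array $rows)
    {
        $this->rows = $rows;
    }

    public function handle(): void
    {
        DB::table('users')->insert($this->rows);
    }
}

Inside the import loop you would then swap the direct DB::table('users')->insert($batchData) call for InsertUsersChunk::dispatch($batchData) and let your queue workers (or Horizon) drain the backlog in the background.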
🧰 Optional: Use Laravel Excel or Spatie Simple Excel
Laravel Excel and Spatie’s Simple Excel are packages optimized for large Excel/CSV imports.
With Spatie Simple Excel (Very Lightweight):
composer require spatie/simple-excel
Example:
use Illuminate\Support\Facades\DB;
use Spatie\SimpleExcel\SimpleExcelReader;

SimpleExcelReader::create('/path/to/users.csv')
    ->useDelimiter(',')
    ->getRows()
    ->chunk(1000)
    ->each(function ($rows) {
        $data = $rows->map(fn ($row) => [
            'name' => $row['name'],
            'email' => $row['email'],
            'created_at' => now(),
            'updated_at' => now(),
        ])->toArray();

        DB::table('users')->insert($data);
    });
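Because getRows() returns a lazy collection, the reader streams the file instead of loading it all into memory, which is what keeps this approach viable at one million rows.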
🛑 What to Avoid
- Inserting everything at once (insert($allRows)) will crash or time out.
- Using Eloquent inside a loop:
foreach ($rows as $row) {
    User::create($row); // ❌ Too slow for large datasets
}
- Dispatching a job for every single row unless you batch and queue them properly.
🧾 Summary
| Method | Best For | Performance |
|---|---|---|
| Raw DB insert (chunked) | Fastest insert | ✅✅✅ |
| Eloquent in loop | Reads well but slow for 1M rows | ❌ |
| Laravel Excel / Spatie Excel | Clean syntax, works well | ✅✅ |
| Queued Jobs | Async handling | ✅✅✅ (with setup) |
🎉 Conclusion
Importing a million rows into your Laravel app is totally feasible — as long as you do it efficiently. Stick to batch inserts, avoid Eloquent when possible, and use tools like Spatie Excel if you prefer a clean interface.
The same approach scales to 10 million rows or more.
If you’d like to go deeper on generating large test CSVs or running the import entirely through queued jobs, leave a comment below.
