Everything you need to know about Laravel required validations

One of the strongest parts of the Laravel core is its set of validation rules, which validate your application's incoming data. One of the most commonly used rules is required. We will see how to use the different variants of the required rule, as well as how to build custom conditional required rules.

  • Required rule :

As per Laravel's documentation, when we use the required rule, the field under validation must be present and non-empty. Empty includes a null value, an empty string, an empty array, and an uploaded file with no path.

In the request below, both first_name and last_name must be present and non-empty.

$request->validate([
    'first_name' => 'required|unique:posts|max:255',
    'last_name' => 'required',
]);
  • Required rule based on other/supporting field(s) :

Many times, when designing complex and nested forms, we need the required validation only if another incoming data field is present. There are multiple cases here :

  1. When you want just a single supporting field to be present, you can use required_with like below :

In the request below, city, state, zipcode and country will be required if address_line is present in the incoming data.

$request->validate([
    'address_line' => 'sometimes|string',
    'city' => 'required_with:address_line',
    'state' => 'required_with:address_line',
    'zipcode' => 'required_with:address_line',
    'country' => 'required_with:address_line'
]);
  1. When you want any one of a set of supporting fields to be present, you can use required_with and pass multiple fields like below :

In the request below, we want to set a validation rule for the notify_new_updates field. We cannot notify the user if we have neither a phone number, an email nor a fax. So the rule required_with:phone,email,fax signifies that notify_new_updates will be required if any one of the fields phone, email or fax is present.

$request->validate([
    'name' => 'required|string',
    'phone' => 'sometimes|string',
    'email' => 'sometimes|email',
    'fax' => 'sometimes|string',
    'notify_new_updates' => 'boolean|required_with:phone,email,fax'
]);
  1. When you want all of the supporting fields to be present, you can use required_with_all and pass multiple fields like below :

In the request below, the rule required_with_all:phone,email,fax signifies that notify_new_updates will be required only if all of the fields phone, email and fax are present.

$request->validate([
    'name' => 'required|string',
    'phone' => 'sometimes|string',
    'email' => 'sometimes|email',
    'fax' => 'sometimes|string',
    'notify_new_updates' => 'boolean|required_with_all:phone,email,fax'
]);
  1. When you want at least one of the supporting fields NOT to be present, you can use required_without and pass one or more fields like below :

In the request below, use_billing_as_shipping will be required when shipping_address is NOT present.

$request->validate([
    'billing_address' => 'required|string',
    'shipping_address' => 'sometimes|string',
    'use_billing_as_shipping' => 'boolean|required_without:shipping_address'
]);

Similarly, you can use required_without_all, which makes the field required only when all of the supporting fields are NOT present.

  1. When you want the other field not only to be present but also to have a specific value, you can use required_if like below :

In the request below, admin_notification_email will be required if and only if the is_admin field is present with a value of 1.

$request->validate([
    'is_admin' => 'required|boolean',
    'admin_notification_email' => 'required_if:is_admin,1|email',
]);
  1. Conversely, when you want the field to be required unless the other field has a specific value, you can use required_unless like below :

In the request below, displayname will be required unless the nickname field is an empty string.

$request->validate([
    'nickname' => 'sometimes|string',
    'displayname' => 'required_unless:nickname,',
]);
  • Custom and complex required if validation rules :

We sometimes need a complex set of conditions for a required rule. With Laravel, it's much easier than you might imagine.

Laravel's Rule facade has a Rule::requiredIf() method which we can use for this purpose. The beauty of this method is that it takes either a boolean value or a closure that returns a boolean. This gives us a lot of flexibility to build complex logic for a required-if rule.

  1. requiredIf() with simple boolean :

The example below will make the employee_id field required only if the current user is an employee and the employee's company is still active.

// Assumes: use Illuminate\Validation\Rule; is imported
$request->validate([
    'employee_id' => Rule::requiredIf($request->user()->is_employee && $request->user()->employee->company->is_active)
]);
  1. requiredIf() with closure :

The example below will make the golden_discount_voucher field required only if the current user is a customer whose orders fulfil the criteria for the golden account category, i.e. more than 10 orders with a grand total above 1000.

// Assumes: use Illuminate\Validation\Rule; is imported
$request->validate([
    'golden_discount_voucher' => Rule::requiredIf(function () use ($request) {
        return $request->user()->is_customer &&
               $request->user()->customer->orders->where('grand_total', '>', 1000)->count() > 10;
    })
]);
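Since Rule::requiredIf() returns a rule object rather than a string, it cannot be joined with the pipe syntax; instead, it is combined with other rules in an array. A small sketch, carrying over the field names from the example above (the extra rules are illustrative):

```php
// Combining a Rule object with string rules via an array
// Assumes: use Illuminate\Validation\Rule; is imported
$request->validate([
    'golden_discount_voucher' => [
        Rule::requiredIf($request->user()->is_customer),
        'string',  // further rules sit alongside the Rule object
        'max:50',
    ],
]);
```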

.....


Tags: laravel, validation, required, required_if, required_unless, required_with, required_with_all, required_without, required_without_all, custom, if, with, unless

Composer stuck at Something's changed looking at all rules again

While upgrading one of my Laravel applications from version 5.3 to 5.4, I came across a weird situation caused by Composer's dependency management. When running composer install, it got stuck at Something's changed, looking at all rules again (1), and Composer showed an attempt counter in brackets which kept increasing from 1 to 900+. Composer was stuck in this situation for over 2 hours with the counter still increasing, even though the system hosting the application had a pretty good configuration.

  • How I got this information :

When you run a Composer command, let's say composer install, it performs its internal steps one by one. However, we only see the high-level output in the command prompt and not the details of those steps. The message I saw cannot be seen if you just run composer install; you need to make the output more verbose by passing the -v command line argument. You can pass up to 3 v's to get more verbosity.

For e.g. :

`composer install -vvv`

When I did this, I got a lot of detail about the steps Composer performs internally; among these verbose messages was Something's changed, looking at all rules again (1).

  • Why composer gets stuck :

It is comparatively easy to understand why it gets stuck. Let's take an example where the application requires 2 packages, techsemicolon/package-first and techsemicolon/package-second :

"require": {
    "php": ">=5.6.4",
    "techsemicolon/package-first": "5.4.*",
    "techsemicolon/package-second": "2.3.*",
    .
    .
    .
    .
}

Now, both of these packages have the package techsemicolon/dependency-package in their own dependencies. But :

`techsemicolon/package-first` requires `techsemicolon/dependency-package` at version `2.*.*` 
`techsemicolon/package-second` requires `techsemicolon/dependency-package` at version `3.*.*`. 

Now it's really hard for Composer to decide whether it should install techsemicolon/dependency-package at version 2.*.* or 3.*.*. When we run composer install or composer update, Composer calculates these dependency trees and decides which version to install or update to. It checks the dependency rules defined in the application's composer.json as well as those of each individual package. Hence it gets stuck at Something's changed, looking at all rules again.

Now, there can be situations where Composer recalculates the rules, shows this message, and within a minute or two finds a version which satisfies the dependencies and proceeds. But in cases like mine, it recalculates the rules in circles and stays stuck for a really long time.

  • The solution :

A note to start with: this is not the only or the best solution. But considering this error is a little different from commonly occurring Composer errors, this is what I could do to resolve it quickly and effectively.

  1. Let's say you have 15 packages in the require section of your app's main composer.json. I followed a very naive approach: I divided those 15 dependencies into 3 batches of 5. I commented out the first 5 and ran composer install. If it still got stuck, I commented out the next 5, uncommented the first 5, and ran composer install again. I kept doing this until I found a faulty batch which, when commented out, let composer install work properly. I kept running composer install with -vvv verbosity so it showed detailed output.

    You will then have n packages in the commented-out batch, 5 in my case.

  2. Then I went to https://packagist.org and looked up each package from the faulty batch. I checked their internal dependencies and requirements (PHP version etc.) and cross-checked the other packages in the same batch for version conflicts.

  3. I still did not find the dependency conflict. So I came back to my app, where composer install worked with the faulty batch commented out, and ran composer show --tree. That's when I found that :

`barryvdh/laravel-dompdf` required `dompdf/dompdf` at version `^0.8` 
`laravel-datatables-oracle` required `dompdf/dompdf` at version `^0.7`

I had to upgrade laravel-datatables-oracle so that its dependency on dompdf/dompdf was at version ^0.8, matching the version installed for barryvdh/laravel-dompdf.
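As a side note, newer Composer versions can do much of this detective work for you: the composer depends command (alias composer why) lists every installed package that requires a given dependency, along with the constraint each one declares. Using the package from my case as an example:

```shell
# List the installed packages that require dompdf/dompdf,
# including the version constraint each one declares
composer depends dompdf/dompdf

# The inverse check: explain why dompdf/dompdf cannot be
# updated to a given version
composer prohibits dompdf/dompdf ^0.8
```

Both commands must be run inside the project directory, as they inspect the installed dependency tree.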

Don't worry, once you go through this in your own app you will find it's not that hard. I freaked out myself when I saw the error for the first time.

  • Quick note on composer diagnosis :

You can run composer diagnose, which gives you suggestions and warnings to improve composer.json's package version constraints, e.g. :

require.barryvdh/laravel-debugbar : exact version constraints (2.3.2) should be avoided if the package follows semantic versioning
require.webmozart/assert : exact version constraints (1.2) should be avoided if the package follows semantic versioning

.....


Tags: laravel, composer, hangs, stuck, Something, changed, rules, package, dependency

Laravel content-length header issue with gzip compression

As an optimization on the server side, we use gzip compression, which compresses the response before sending it to the client. Compressing responses significantly reduces the size of the data being transmitted.

As you can imagine, any type of compression requires a certain amount of memory and processing power. Considering that gzip compression happens at runtime, it consumes considerable processing resources, which is why it is important to configure compression only for the responses which really need it.

  • Configuring compression :

An application web server produces different types of responses; some might be extremely small, around 10 bytes, and some very large, more than 1MB. It is fair to say that the 10-byte response does not need compression, as it is already tiny. If your server attempts to compress such small responses as well, it consumes unnecessary resources in response compression.

To avoid this, we use gzip directives to tell the gzip compression mechanism to compress only the responses whose size is greater than a defined value. We use the gzip_min_length directive for this, which takes a numeric value representing bytes.

gzip_min_length 1000;

As per above gzip configuration, any response having size less than 1000 bytes will be transmitted to the client as it is without any compression. Any response having size greater than 1000 bytes will be compressed first at runtime and then transmitted to the client.

Note : The value of 1000 bytes is taken just as an example for this article. Please change it as per your requirements.
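For context, here is a minimal sketch of an nginx gzip block using this directive; the type list and compression level below are illustrative values, not recommendations:

```nginx
gzip on;
# Responses smaller than 1000 bytes are sent uncompressed
gzip_min_length 1000;
# Moderate compression level; higher levels cost more CPU per response
gzip_comp_level 2;
# Only compress text-like content types
gzip_types text/plain text/css application/json application/javascript;
```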

  • Now here is the catch :

Even if you have the gzip_min_length directive set, under the hood it relies on the Content-Length header being present in the response.

The gzip module during the compression determines the length only from the “Content-Length” response header field

So, if your app server is NOT sending the Content-Length header, the gzip_min_length directive cannot determine the response size, and it will default to compressing all responses before transmitting them to the client.

This is pretty bad for a server handling a lot of requests, as nginx's gzip wastes resources compressing responses that are tiny in size.

  • The solution :

The solution is fairly simple: we need to add Content-Length to each of our responses so that gzip can determine the response size and decide whether compression should be done.

As we are focusing on Laravel in this article, we can add the content length using a simple middleware. Let's call it ContentLengthMiddleware :

<?php

namespace App\Http\Middleware;

use Closure;

class ContentLengthMiddleware
{
    /**
     * Handle an incoming request.
     *
     * @param  \Illuminate\Http\Request  $request
     * @param  \Closure  $next
     * @return mixed
     */
    public function handle($request, Closure $next)
    {
        $response = $next($request);

        // Get the rendered response body length in bytes
        $responseLength = strlen($response->content());

        // Add the header
        $response->header('Content-Length', $responseLength);

        // Return the response
        return $response;
    }
}

Last but not least, you need to register the middleware in app/Http/Kernel.php.
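For example, to run it on every request you could append it to the global middleware stack (a sketch; adjust the class name and path to match the middleware you created):

```php
<?php

// app/Http/Kernel.php (excerpt)
protected $middleware = [
    // ... existing global middleware ...
    \App\Http\Middleware\ContentLengthMiddleware::class,
];
```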

Now you will be able to see Content-Length: xxx in your dev tools under response headers.

  • Benchmarking :

It is important to see whether these changes made any difference. I used the Siege load testing tool to test my server before and after the above changes. I could definitely see quicker response times and less CPU utilization. Note that this will mostly show under high workload; if you only have a couple of requests hitting your server, the difference in performance will be negligible.

  • Quick note on gzip_comp_level :

Gzip compression has another directive called gzip_comp_level, which accepts values from 1 to 9, where compression increases from 1 to 9. This is another vital directive: the higher the compression level, the more resources are consumed compressing the response. For most servers a value of 2 or 3 is enough, as the difference in compression from level to level is not significant.

.....


Tags: laravel, content, length, header, compression, gzip, nginx, middleware, memory

AWS Load Balancer stickiness and load distribution

AWS load balancing is an interesting cloud service which automatically distributes incoming application traffic across multiple available target servers, such as Amazon EC2 instances, containers, IP addresses, and Lambda functions. It helps the infrastructure achieve high availability and automatic scaling, and as a result makes the application more fault tolerant.

  • Quick question :

As per the concept of an AWS load balancer with autoscaling, if traffic increases beyond what the current servers can handle, a new server is launched automatically and added under the load balancer so that traffic is distributed across the available target servers.

Let's take an example :

We have an Application Load Balancer (ALB) which has a minimum of 2 servers and can autoscale to a maximum of 10 servers. Each server can serve traffic for 50 users.

The group has an autoscaling policy based on CPUUtilization: when a server goes above 75% on the CPUUtilization metric, autoscaling should spin up a new instance.

At 8AM there are 50 users; the traffic is distributed across the 2 servers and everything is normal. At 10AM there are 95 users and CPUUtilization is greater than the 75% threshold, so a new server spins up.

Now, at 10:15AM, can you be sure that the traffic load is distributed evenly across the instances?

  • Let's find out :
  1. Server spin up time :

Server spin-up time, also called instance warmup time, is the duration between when a server spin-up is initiated and when the server is ready to serve requests. It depends on the configuration scripts and startup commands which run while the server is spinning up.

The instances use a configuration script to install and configure software before the instance is put into service. As a result, it takes around two or three minutes from the time the instance launches until it comes into service. This is not entirely in our hands; AWS's internal infrastructure also contributes to this time. The actual time depends on several factors, such as the size of the instance and whether there are startup scripts to complete.

However, it is important to know how much time your server takes on average to be ready, because if it takes on average 15 minutes to spin up, all the increased traffic will still be served by the old, overwhelmed servers for that 15-minute period.

AWS uses the cooldown period setting in simple autoscaling policies to account for the startup time.

  1. Sticky sessions or stickiness of the load balancer :

Sticky sessions, or stickiness of the load balancer, is the setting that routes incoming requests for a particular session to the same target server that serviced the first request of that session. In short, under a load balancer having 3 servers S1, S2 and S3, if User A's first request was served by server S2, their subsequent requests will also be served by S2 until the stickiness expires, is disabled, or is deliberately changed (by sending a different AWSALB cookie in the request).

More interestingly, the load balancer always distributes traffic using a round-robin algorithm. Without stickiness this happens per request; with stickiness it happens per session.

As soon as a new instance spins up and joins the group, it will not immediately get all the traffic, but it will join the pool for round-robin distribution of incoming traffic.

With the stickiness setting enabled, existing sessions will still have all their requests routed to the existing instances. Only new sessions may hit the new instance, and being round robin, new sessions may also get routed to the existing instances.

Without stickiness, requests from existing users can immediately get routed to the new instance. Hence it's safe to say that how quickly the traffic load is equalized across all available instances depends on stickiness.

Sticky sessions come with an expiration setting. You can specify a time from 1 second to 7 days. This setting is really important and you should definitely pay attention to the value you set: if it is very large, spinning up new instances when autoscaling kicks in will not be very useful, as existing traffic will still stick to the old servers.

A key point to note: for the sticky session expiration period, you enter the cookie expiration period in seconds. If you do not specify an expiration period, the sticky session lasts for the duration of the browser session.

  • Know if you need sticky sessions :

Even though the sticky session setting is a good choice, it is for applications which maintain session state on the target instance itself, for example a PHP web server keeping sessions on the local filesystem of an EC2 instance. In that case, if a user's request is served by another EC2 instance, the user will be logged out due to the session not being present there.

However, if your session state is managed by a separate service like RDS, Redis or ElastiCache, which is independent of which target server serves the request, you probably do not need sticky sessions at all.

  • The bottom line :

Test and find the best settings for your application when setting up autoscaling policies, because if your servers are not ready quickly when the application needs them to serve increased traffic, it is going to affect your application's performance, especially during peak hours.

.....


Tags: aws, loadbalancer, load, balancer, stickiness, sticky, session, cpu, distribution

Laravel use secure SSL connection while using AWS RDS

Amazon Relational Database Service (Amazon RDS) is a cloud-based web service that makes it easier to set up, operate, and scale relational databases in the cloud. It has become one of the popular choices when setting up a Laravel database infrastructure.

  • Quick question :

If you are using AWS RDS in your Laravel application, is your connection encrypted in transit? Or, to ask it the other way around, is your Laravel application connecting to AWS RDS over a secure SSL connection?

If you think that RDS comes with support for encryption in transit, you are right; it implements SSL. However, is it turned on by default when we connect? Not really. I had the perception that my Laravel app was secured in transit by SSL encryption until I found out it wasn't.

  • Let's find out :

Connect to the environment/server/container where laravel is hosted and run tinker :

php artisan tinker

Once, the tinker prompt is open, run following :

>>> DB::select("SHOW STATUS LIKE 'Ssl_cipher'")

If it gives output like following :

=> [
    {
        +"Variable_name": "Ssl_cipher",
        +"Value": "DHE-RSA-AES128-SHA",
    },
]

Then the Laravel application is connecting to AWS RDS via a secure SSL connection.

However, if the output is like this :

=> [
    {
        +"Variable_name": "Ssl_cipher",
        +"Value": "",
    },
]

Then the connection is not using SSL. There are a number of status variables which can help us get more information about the SSL connection parameters. We checked Ssl_cipher above; you can also check Ssl_version, which will be blank without SSL, or something like TLSv1 if SSL is working.

To get all information about SSL connection run following in tinker prompt :

>>> DB::select("SHOW STATUS LIKE '%Ssl%'")
  • Next steps to secure the connection :

If you found that the Laravel application is not using SSL while connecting to AWS RDS, you can follow the steps below to enable it.

Firstly, let us understand how this works. When you connect to AWS RDS normally via the mysql CLI, you do :

mysql -h myinstance.c9akciq32.rds-us-east-1.amazonaws.com -u username -p

You can pass an SSL certificate using the --ssl-ca option in the above command, like below :

mysql -h myinstance.c9akciq32.rds-us-east-1.amazonaws.com --ssl-ca=/path/to/certificate-authority-file.pem -u username -p

Optionally, you can pass --ssl-mode and --ssl-verify-server-cert. For more details about these, please refer to MySQL's official documentation.

Now let's get back to the original problem we are here to solve: how are we going to do this in Laravel?

Step 1 : Download the certificate authority file. AWS RDS has a commonly published pem file called rds-combined-ca-bundle.pem which you can download directly from AWS. It is an officially published pem file which works for all default RDS SSL connections.

Step 2 : Save the downloaded file from step 1 inside a new directory called RDSCerts in the Laravel root. A quick note: at this step I would add this directory to .gitignore, because there is no need to keep pem and cert files inside version control.

Step 3 : Laravel's database configurations live in the config/database.php file, which already has a mysql section. Let's not change that; instead, copy it entirely into a new configuration section called mysql_ssl, where we also add the certificate authority file under options, like below :

'mysql_ssl' => [
    'driver' => 'mysql',
    'host' => env('DB_HOST', '127.0.0.1'),
    'port' => env('DB_PORT', '3306'),
    'database' => env('DB_DATABASE', 'forge'),
    'username' => env('DB_USERNAME', 'forge'),
    'password' => env('DB_PASSWORD', ''),
    'unix_socket' => env('DB_SOCKET', ''),
    'charset' => 'utf8mb4',
    'collation' => 'utf8mb4_unicode_ci',
    'prefix' => '',
    'prefix_indexes' => true,
    'strict' => false,
    'engine' => null,
    'options' => [    
        PDO::MYSQL_ATTR_SSL_CA => base_path('RDSCerts/rds-combined-ca-bundle.pem')
    ],
],

An important note: you might be wondering whether we need to pass the --ssl-verify-server-cert option from Laravel's configuration as well. Don't worry, it's enabled by default. If you want to disable it, you can set PDO::MYSQL_ATTR_SSL_VERIFY_SERVER_CERT => false, which I would not suggest.

There are more options about SSL which you can check in the official PDO documentation.

Once you follow the above 3 steps, you should be good to go. Cross-check by running DB::select("SHOW STATUS LIKE '%Ssl%'") in tinker as we did earlier in this article. You should see the cipher and SSL version mentioned in the connection status.
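One step the article leaves implicit: the application must actually be told to use the new mysql_ssl connection. The simplest way, assuming you want it as the default connection, is via the .env file:

```ini
# .env - point the default database connection at the new config section
DB_CONNECTION=mysql_ssl
```

Alternatively, a single query can target it explicitly with DB::connection('mysql_ssl').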

.....


Tags: laravel, php, aws, rds, ssl, encryption, transit, secure, connection

Laravel finally has higher level orWhere for query scopes

Laravel's excellent query builder and Eloquent ORM are as handy as ever, and with newer versions of Laravel they keep getting simpler. 5.8 is here already, and this feature has not been talked about much because it seems like a minor feature. Let's dive in with an example :

<?php

namespace App;

use Illuminate\Foundation\Auth\User as Authenticatable;
use Role;

class User extends Authenticatable
{
    /**
     * The attributes that should be hidden for arrays.
     *
     * @var array
     */
    protected $hidden = [
        'password', 'remember_token',
    ];

    /**
     * The attributes that should be mutated to dates.
     *
     * @var array
     */
    protected $dates = ['deleted_at'];

    /**
     * Scope a query to only filter admin users.
     *
     * @param \Illuminate\Database\Eloquent\Builder $query
     * @return \Illuminate\Database\Eloquent\Builder
     */
    public function scopeAdmin($query)
    {
        return $query->where('role', Role::Admin);
    }

    /**
     * Scope a query to only filter reviewer users.
     *
     * @param \Illuminate\Database\Eloquent\Builder $query
     * @return \Illuminate\Database\Eloquent\Builder
     */
    public function scopeReviewer($query)
    {
        return $query->where('role', Role::Reviewer);
    }

    /**
     * Scope a query to only filter customer users.
     *
     * @param \Illuminate\Database\Eloquent\Builder $query
     * @return \Illuminate\Database\Eloquent\Builder
     */
    public function scopeCustomer($query)
    {
        return $query->where('role', Role::Customer);
    }
}

The use case: we want to get the users who have either the admin or the reviewer role.

For earlier versions of Laravel (version < 5.8), we would do :

// Assumes: use Illuminate\Database\Eloquent\Builder; is imported
$users = App\User::admin()->orWhere(function (Builder $query) {
    $query->reviewer();
})->get();

There is nothing difficult in the above Eloquent query, but having a closure just to apply a scope in an orWhere clause sometimes feels like too much. That is no longer the case as of 5.8. Now we can do :

$users = App\User::admin()->orWhere->reviewer()->get();

And that's it. It's a very small change, but it makes the code much simpler to implement and read. You can sometimes get lost in closures when the query is large; features like this make it much more comfortable.

.....


Tags: laravel, php, orWhere, where, eloquent, querybuilder, 5.8

Laravel has one through

Laravel 5.8 is now released, and one thing from the release notes many people may have been waiting for is the hasOneThrough relationship. Laravel earlier had hasManyThrough, but with the addition of hasOneThrough it is more flexible. Let's dive into an example :

Consider we have a User model, a Reviewer model and an Activity model with the following structure :

users
    id - integer
    reviewer_id - integer

reviewers
    id - integer

activities
    id - integer
    user_id - integer

Each reviewer has one user, and each user is associated with one activity record. If we want to get the activity from the reviewer, there is no reviewer_id column in the activities table; in such a case we can use the hasOneThrough relation. Let's define the relationship now :

<?php

namespace App;

use Illuminate\Database\Eloquent\Model;

class Reviewer extends Model
{
    /**
     * Get the reviewer's activity.
     */
    public function activity()
    {
        return $this->hasOneThrough('App\Activity', 'App\User');
    }
}

To understand more, let's use Laravel's conventional terms for the models. In this case, Activity is the final model we want to reach, while User is the intermediate model.

As with all other relationships in Laravel, we can also explicitly specify the foreign keys while defining a hasOneThrough relation. Let's do that for this example :

<?php

namespace App;

use Illuminate\Database\Eloquent\Model;

class Reviewer extends Model
{
    /**
     * Get the reviewer's activity.
     */
    public function activity()
    {
        return $this->hasOneThrough(
            'App\Activity',
            'App\User',
            'reviewer_id', // Foreign key on users table...
            'user_id', // Foreign key on activities table...
            'id', // Local key on reviewers table...
            'id' // Local key on users table...
        );
    }
}

Now you can easily do :

$reviewer = Reviewer::first();
$activity = $reviewer->activity;

.....


Tags: laravel, php, eloquent, hasonethrough, relationships, 5.8

Laravel limiting chunk collection

  • Little Intro about chunk :

Laravel comes with an excellent Eloquent ORM. There are a lot of cool things in Eloquent, one of which is the chunk method.

You usually need it when processing large data sets. Let's say you want to update the users table and assign a coupon code to each user based on a 3rd party API.

What you can do is :

User::get()->each(function($user){

    $coupon = API::getCouponCode($user->email);
    $user->coupon = $coupon;
    $user->save();
});

Doing this is fine, but if you have thousands of users, you are loading them all up at once for the coupon-saving process. That is going to consume a huge amount of memory at a time, and the server may be exhausted because so much data is held in memory for processing at once.

The chunk method helps with this kind of implementation. Chunking records means taking a batch of records limited by a count, processing them, taking the next batch, and so on. The idea is that you process a subset of the data at a time instead of loading the entire data set into memory.

The chunk method will retrieve a "chunk" of Eloquent models, feeding them to a given Closure for processing. Using the chunk method will conserve memory when working with large result sets.

Let's do the same, but with chunk this time :

$recordsPerBatch = 50;

User::chunk($recordsPerBatch, function($users){

    // chunk passes a Collection of users to the closure, so iterate over it
    foreach ($users as $user) {
        $coupon = API::getCouponCode($user->email);
        $user->coupon = $coupon;
        $user->save();
    }
});

Now, as you can probably guess, this will take 50 user records at a time to process; once those are completed it will take the next 50, until all records are processed by the chunk closure.

  • The main problem, limit with chunk :

Let's apply limit to the above example :

$recordsPerBatch = 50;

User::limit(100)->chunk($recordsPerBatch, function($users){

    foreach ($users as $user) {
        $coupon = API::getCouponCode($user->email);
        $user->coupon = $coupon;
        $user->save();
    }
});

If you thought Laravel would chunk 50 records into 2 batches since we are limiting the total records to 100... oops, not really.

Laravel will run a series of queries like below to chunk the records :

select * from `users` limit 50 offset 0
select * from `users` limit 50 offset 50
select * from `users` limit 50 offset 100
...

The chunk method ignores any limit applied earlier in the query, whether via limit and offset or take and skip.

This has been a problem for which a lot of people raised issues on Laravel's GitHub repo, but Laravel's core dev team mentioned this is the expected behaviour, and rightfully so: chunk itself uses limit and offset to split the entire result set into batches.
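To see why, here is a simplified plain-PHP sketch of the paging arithmetic chunk performs (the real implementation lives in Laravel's query builder; this only models the queries it emits). Every page gets its own limit/offset pair, which is exactly why any earlier limit() is overwritten:

```php
<?php

// Simplified sketch: chunk() fetches page after page with its own
// "limit $perBatch offset $offset", stopping when a page comes back short.
function chunkQueries(int $totalRows, int $perBatch): array
{
    $queries = [];
    $page = 1;

    do {
        $offset = ($page - 1) * $perBatch;
        $queries[] = "select * from `users` limit $perBatch offset $offset";
        // how many rows this page would actually return
        $fetched = max(0, min($perBatch, $totalRows - $offset));
        $page++;
    } while ($fetched === $perBatch);

    return $queries;
}

print_r(chunkQueries(120, 50));
```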

  • And here it is, the solution you were waiting for :

Laravel has a chunk variation called chunkById. Let's use the first example and implement chunkById now :

$recordsPerBatch = 50;

User::chunkById($recordsPerBatch, function ($users) {

    foreach ($users as $user) {
        $coupon = API::getCouponCode($user->email);
        $user->coupon = $coupon;
        $user->save();
    }
});

The fundamental difference between chunk and chunkById is how the queries are structured :

select * from `users` where `id` > 0 order by `id` asc limit 50
select * from `users` where `id` > 50 order by `id` asc limit 50
select * from `users` where `id` > 100 order by `id` asc limit 50
select * from `users` where `id` > 150 order by `id` asc limit 50
...

If you observe the queries issued by chunkById :

  1. It adds an order by clause on the id column (by the way, you can pass the column as the 3rd argument to chunkById if it is not id)
  2. It adds where id > x each time it processes the next batch, where x is the id of the last record in the previous batch
  3. There is no offset used; the id column is conceptually used as an offset
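This pattern is known as keyset pagination. A hedged plain-PHP sketch of the queries it produces (ids are assumed contiguous here purely for illustration; in reality x is whatever id the previous batch ended on):

```php
<?php

// Sketch of chunkById's query pattern: each batch filters on the last
// seen id instead of using an offset. Ids assumed contiguous for brevity.
function chunkByIdQueries(int $maxId, int $perBatch): array
{
    $queries = [];
    $lastId = 0;

    while ($lastId < $maxId) {
        $queries[] = "select * from `users` where `id` > $lastId order by `id` asc limit $perBatch";
        $lastId += $perBatch;
    }

    return $queries;
}

print_r(chunkByIdQueries(150, 50));
```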

This gives you an advantage: you can implement your own limit by adding a where clause on the id column, just like chunkById does internally.

Let's limit the chunk in our example with 100 records to chunk :

$recordsPerBatch = 50;

$limit = 100;

// id of the first record beyond our limit (assumes the table has more than $limit rows)
$maxId = User::orderBy('id', 'asc')->offset($limit)->limit(1)->value('id');

User::where('id', '<', $maxId)->chunkById($recordsPerBatch, function ($users) {

    foreach ($users as $user) {
        $coupon = API::getCouponCode($user->email);
        $user->coupon = $coupon;
        $user->save();
    }
});

What we did was: since we cannot use limit directly, we fetched the id which falls just above the limit we want and used it in a where('id', '<', $maxId) clause.

This will then chunk the 100 results, with 2 batches of 50 records in each batch. Cool, isn't it!


Tags: laravel, php, chunk, limit, eloquent

PHP argument validation using assert

When creating classes, we either pass arguments to the constructor or have setter-getter functions to access class properties. These arguments may sometimes have specific format requirements. For example :

/**
 * @param  int  $age
 *
 * @return string
 */
function greet($age){
    return "You are $age years old";
}

The above function expects $age to be an integer. If any other type is passed, the function will not work as expected.
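A minimal hand-rolled guard illustrates the idea before we reach for a library (the name greetSafely is just for this sketch):

```php
<?php

// Illustrative guard: reject anything that is not an integer up front.
function greetSafely($age): string
{
    if (!is_int($age)) {
        throw new InvalidArgumentException('$age must be an integer. Got: ' . gettype($age));
    }

    return "You are $age years old";
}

echo greetSafely(25); // prints "You are 25 years old"
```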

The example above is trivial, but when your core classes have functions which affect important domain logic of your project, argument validation becomes an important concern.

PHP comes with a default function (or, more specifically, a language construct since PHP 7) called assert, which checks whether the provided assertion is false and, if so, lets you handle the failure. Even though this is useful, I strongly suggest using the composer package webmozart/assert instead. It has very intuitive methods for the same purpose.

  • Example :

Let's consider a function placeOrder() on an Order class whose constructor takes an array $items of Item instances, a $shippingPrice, and $customer, an instance of the Customer class.

<?php 

use Webmozart\Assert\Assert;

class Order{

    /**
     * Array of Item instances
     * @var array
     */
    private $items;

    /**
     * Shipping Price
     * @var float
     */
    private $shippingPrice;

    /**
     * Customer Instance
     * @var Customer
     */
    private $customer;

    /**
     * Instantiate Order class
     *
     * @param  array     $items
     * @param  float     $shippingPrice
     * @param  Customer  $customer
     */
    function __construct($items, $shippingPrice, $customer)
    {
        Assert::isArray($items, '$items must be an array. Got: %s');
        Assert::allIsInstanceOf($items, '\Item', '$items must contain array of Item class instances. Got: %s');
        Assert::numeric($shippingPrice, '$shippingPrice must be numeric. Got: %s');
        Assert::isInstanceOf($customer, '\Customer', '$customer must be an instance of Customer class. Got: %s');

        $this->shippingPrice    = $shippingPrice;
        $this->customer         = $customer;
        $this->items            = $items;
    }

    /**
     * Places order
     * 
     * @return Order
     */
    public function placeOrder(){

        // PlaceOrder
    }
}

Now, when we instantiate the order class :

$customer = new Customer('John Doe');

$order = new Order(['not-an-item'], 12.22, $customer);

We are passing an invalid value for the $items argument, since the array contains a string instead of Item instances. Assert::allIsInstanceOf checks each element (note that an empty array would pass silently), so this will throw \InvalidArgumentException with an error message built from the $items must contain array of Item class instances. Got: %s template.

If you are using setter-getter functions instead of constructor parameters, you can place these assertions inside the setter functions. The above methodology is sometimes referred to as defensive programming, where we make sure the function will not break, right at the initial stages. The counter concept to defensive programming would be validating these arguments as and when they are actually used inside the placeOrder function.
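A sketch of the setter variant, using a plain guard so the snippet stays dependency-free (with webmozart/assert you would call Assert::numeric instead; the class and property names here are illustrative):

```php
<?php

// Illustrative setter-based validation: the guard lives in the setter,
// so the property can never hold an invalid value.
class Shipment
{
    private float $shippingPrice = 0.0;

    public function setShippingPrice($price): void
    {
        if (!is_numeric($price)) {
            throw new InvalidArgumentException('$shippingPrice must be numeric. Got: ' . gettype($price));
        }

        $this->shippingPrice = (float) $price;
    }

    public function getShippingPrice(): float
    {
        return $this->shippingPrice;
    }
}

$shipment = new Shipment();
$shipment->setShippingPrice('12.22');
echo $shipment->getShippingPrice(); // prints 12.22
```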

Advantages :

  1. Prevents errors caused due to unexpected type of arguments passed
  2. Easier to debug
  3. Easier to create unit tests and catch assertion exceptions
  4. Easier to write readable custom error messages

Disadvantages :

If not thought through properly, duplicate validations can cause redundancy. For example, in the above case, whether an Item instance contains a valid price is a concern of the Item class. If that validation is also present in the Order class, it creates redundancy.

You can find more assertion options available here : https://github.com/webmozart/assert


Tags: laravel, php, validation, arguments, assert

Laravel make custom class using console command

Laravel ships with built-in console make commands to generate class files for controllers, models, migrations, etc. For example, to create a resource controller you can do :

php artisan make:controller CustomerController --resource

Depending on your project structure and design patterns, you might be using different classes like repositories, services, etc. We can take advantage of console commands to save time and generate them just like we created the controller above.

File generation :

We first create a template file which acts as a blueprint of how the new class file will look. This file is called a stub file.

Then, when the command is run, we simply copy the stub to a new PHP class file and replace DummyClass, DummyNamespace, etc. with the new file's class name and namespace.
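The copy-and-replace step can be sketched in plain PHP (the placeholder and function names here are illustrative, not Laravel's internals):

```php
<?php

// Minimal sketch of stub rendering: swap Dummy placeholders for real names.
$stub = <<<'STUB'
<?php

namespace DummyNamespace;

class DummyClass
{
}
STUB;

function renderStub(string $stub, string $namespace, string $class): string
{
    return str_replace(
        ['DummyNamespace', 'DummyClass'],
        [$namespace, $class],
        $stub
    );
}

echo renderStub($stub, 'App\Repo', 'CustomerRepository');
```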

The above steps are made much simpler by the GeneratorCommand class.

The GeneratorCommand class :

Laravel's make commands extend an abstract class, GeneratorCommand. This class has all the methods which handle making a directory, getting the content of the stub, replacing the class & namespace, saving the new file, and so on.

This abstract class in turn extends the Illuminate Command class.

Example :

Let's create a make command for generating Repository for a model.

  • Create the stub :

Create a directory called stubs in the laravel project root. I like keeping non-PHP files outside the app folder, but you can put the stubs wherever you prefer.

Create a new file inside the stubs directory called Repository.stub. We are omitting the repository-specific functions, as the goal is to focus on creating the class file. You can add your repository functions inside the stub, or maybe create a base class and have the stub's class extend it.

<?php

namespace DummyNamespace;

use App\DummyModel;

class DummyRepository
{
    protected $model;

    /**
     * Instantiate repository
     *
     * @param DummyModel $model
     */
    public function __construct(DummyModel $model)
    {
        $this->model = $model;
    }

    // Your methods for repository
}

The usual idea is whatever is to be replaced, we prefix it with Dummy.

  • Creating command :

Let's create command scaffolding from artisan :

php artisan make:command RepositoryMakeCommand

The make command classes are generally named in format WhatToCreateMakeCommand. If we had to create a service class, we could say ServiceMakeCommand.

We need to edit the RepositoryMakeCommand.php file which will be generated inside app/Console/Commands folder.

<?php

namespace App\Console\Commands;

use Illuminate\Console\GeneratorCommand;
use Symfony\Component\Console\Input\InputArgument;
use Symfony\Component\Console\Exception\InvalidArgumentException;

class RepositoryMakeCommand extends GeneratorCommand
{
    /**
     * The name and signature of the console command.
     *
     * @var string
     */
    protected $name = 'make:repository';

    /**
     * The console command description.
     *
     * @var string
     */
    protected $description = 'Create a new model repository';

    /**
     * The type of class being generated.
     *
     * @var string
     */
    protected $type = 'Repository';

    /**
     * The name of class being generated.
     *
     * @var string
     */
    private $repositoryClass;

    /**
     * The name of class being generated.
     *
     * @var string
     */
    private $model;

    /**
     * Execute the console command.
     *
     * @return bool|null
     */
    public function handle(){

        $this->setRepositoryClass();

        $path = $this->getPath($this->repositoryClass);

        if ($this->alreadyExists($this->getNameInput())) {
            $this->error($this->type.' already exists!');

            return false;
        }

        $this->makeDirectory($path);

        $this->files->put($path, $this->buildClass($this->repositoryClass));

        $this->info($this->type.' created successfully.');

        $this->line("<info>Created Repository :</info> $this->repositoryClass");
    }

    /**
     * Set repository class name
     *
     * @return  void
     */
    private function setRepositoryClass()
    {
        $name = ucwords(strtolower($this->argument('name')));

        $this->model = $name;

        $modelClass = $this->qualifyClass($name);

        $this->repositoryClass = $modelClass . 'Repository';
    }

    /**
     * Replace the class name for the given stub.
     *
     * @param  string  $stub
     * @param  string  $name
     * @return string
     */
    protected function replaceClass($stub, $name)
    {
        if(!$this->argument('name')){
            throw new InvalidArgumentException("Missing required argument model name");
        }

        $stub = parent::replaceClass($stub, $name);

        return str_replace('DummyModel', $this->model, $stub);
    }

    /**
     * 
     * Get the stub file for the generator.
     *
     * @return string
     */
    protected function getStub()
    {
        return base_path('stubs/Repository.stub');
    }

    /**
     * Get the default namespace for the class.
     *
     * @param  string  $rootNamespace
     * @return string
     */
    protected function getDefaultNamespace($rootNamespace)
    {
        return $rootNamespace . '\Repo';
    }

    /**
     * Get the console command arguments.
     *
     * @return array
     */
    protected function getArguments()
    {
        return [
            ['name', InputArgument::REQUIRED, 'The name of the model class.'],
        ];
    }
}

  • Registering the command :

Finally, we need to register this command to make it available in the artisan console. We can do that by adding it inside app/Console/Kernel.php.

Inside the protected $commands = [] array, add the class path :

\App\Console\Commands\RepositoryMakeCommand::class,

  • Using the command :

php artisan make:repository Customer

This will create a CustomerRepository.php inside app/Repo folder :

<?php

namespace App\Repo;

use App\Customer;

class CustomerRepository
{
    protected $model;

    /**
     * Instantiate repository
     *
     * @param Customer $model
     */
    public function __construct(Customer $model)
    {
        $this->model = $model;
    }

    // Your methods for repository
}

You can extend this implementation as per your project's needs. This indeed saves time in the long run and makes sure all generated classes share a common blueprint structure.


Tags: laravel, php, helpers, collections