Gun Review: SIG SAUER P365 Manual Safety

Josh Wayner for TTAG

Unless you’ve been living under a rock, you’ll know that SIG SAUER has upped the ante in the concealed carry and self-defense world with their P365. For the cave dwellers among you, the P365 — SIG calls it “America’s most popular handgun” — is a sub-compact 9mm pistol that boasts excellent accuracy and a staggering 10+1 standard capacity in a very compact form factor.

Recently they raised the bar by making their already innovative pistol safer in the eyes of many people, this writer included.

The P365 variant here, announced at this year’s SHOT Show, features a manual thumb safety. That may not seem like a big deal until you realize that a manual safety is a make-or-break option for many, many shooters. I conducted an informal survey among several dozen people I encountered in the course of my daily travels, and about two-thirds of them wanted an external safety.

Tactical polo bros rarely value a manual safety and sometimes consider it to be a dangerous feature or an outright liability. This high-speed mentality often ignores the fact that the majority of people who carry aren’t gun people and want simple, safe features that they are comfortable with.

I won’t get into the nuances of this mentality as I could never consume enough Monster Energy flavored vape cartridges to put myself into the mindset necessary to relay it to you.

Suffice it to say, even routine actions like loading and holstering can lead to an accidental discharge or an unsafe situation when no manual safety is present. The fact that the gun is basically ready to fire the moment you put your hand on the grip does not delight many people, especially those who carry off-body in a purse, backpack, or diaper bag.

The addition of a manual safety model to the P365 line is the thing that fully convinced me to make this a daily use carry gun. I previously reviewed the original P365 and liked it, but I never really felt great about carrying it. The triggers on the P365 pistols I have tested are light and crisp, just like a full-size pistol.

The small grip and crisp trigger made me somewhat uncomfortable carrying it for fear of an accidental discharge. The addition of a manual safety makes me completely comfortable with it now, whether in a pocket holster or IWB.

Josh Wayner for TTAG

I have carried a Smith & Wesson .38 J-Frame for years and it has no manual safety. It also has a super-heavy double action trigger that really can’t be pulled by accident.

The SIG P365 has a trigger pull that’s significantly lighter at about 7 lbs, which puts it in the same range as most regular 1911s and some AR rifles. That’s relatively light, which, in my book, makes it absolutely necessary to exercise caution when carrying with a round chambered. There is no trigger safety, so it is imperative that you are unerringly careful.

While it may not seem like a huge upgrade, the manual thumb safety makes the P365 one of the safest everyday carry guns on the market today. The safety itself is ambidextrous and clicks positively into position. A great feature of this model is that it can be loaded and unloaded with the safety on. The same can’t be said for many other guns which only allow loading with the thumb safety off.

The 15-round magazines do add a bit of length, but they’re great as backup mags. (Josh Wayner for TTAG)

Aside from the safety, the features of the gun are the same as the standard P365 (see our review here). Night sights are standard; they’re easy to use and very easy to pick up in low light or total darkness. Magazines drop free with no hangups. Included with the gun are two 10-round mags, one with a short finger extension and the other a flush-fit version. Twelve- and 15-round extended mags are available, too.

I tested this new SIG P365 version with the safety with a mountain of ammunition to fully ensure that it’s able to pass my own standards for a carry gun and to dispel lingering rumors of problems with the design.

The P365 I received saw over 2,000 rounds right out of the box and was never cleaned or even wiped down. I fired just about every brand of ammo and recorded my results below.

SIG SAUER 115gr 365: 1075 fps, 1.25”
SIG SAUER 365 115gr FMJ: 1067 fps, .75”
SIG SAUER 365 115gr V-Crown: 1079 fps, .75”
SIG SAUER 124gr V-Crown: 1140 fps, 1”
SIG SAUER 124gr FMJ: 1112 fps, 1.25”
SIG SAUER 115gr V-Crown: 1190 fps, 1.25”
SIG SAUER 147gr Elite Competition: 899 fps, .75”
Hornady 135gr +P Critical Duty: 1050 fps, 1.5”
Hornady 124gr +P Critical Duty: 1130 fps, 1.5”
Hornady Custom 147gr XTP: 950 fps, 2”
Hornady Critical Defense 115gr FTX: 1123 fps, 2”
Buffalo Bore 147gr Outdoorsman: 1000 fps, 2.5”
Buffalo Bore Barnes 95gr +P+: 1349 fps, 2”
Buffalo Bore 147gr JHP +P+: 1060 fps, 1”
Black Hills 115gr FMJ: 1075 fps, 1”
Black Hills 100gr HoneyBadger +P: 1175 fps, .5”
Black Hills 125gr Subsonic HoneyBadger: 973 fps, 2”
Lehigh Defense 70gr HERO: 1490 fps, .5”
Lehigh Defense 90gr Xtreme Defense +P: 1301 fps, 1”
Lehigh Defense 105gr Controlled Fracture: 1100 fps, .75”
Lehigh Defense 105gr Max Expansion: 1050 fps, .75”

Accuracy shown is the average of three five-shot groups at 15 yards; velocity is the average of 10 rounds fired over an Oehler 35P chronograph five feet from the muzzle.

As far as general performance was concerned, the P365 shot anything and everything I put through it. I had absolutely no failures to feed, to fire, or to eject with any ammo tested. The pistol was very accurate, especially for a compact carry gun, with virtually everything I fired through it.

Black Hills 100gr +P is an amazing load. (Josh Wayner for TTAG)

Of particular note was the accuracy I got from the Black Hills 100gr HoneyBadger. I have tested this load in numerous guns across many months and even did a standalone review here on TTAG. I keep coming back to this load in my article notes as it is just so damn accurate and reliable. I love all the ammo I test, but for whatever reason this load from Black Hills is always the most accurate at every range.

Close in at 15 yards it was matched by Lehigh Defense’s HERO load, but out at 25 yards it quickly outpaced all the others and behaved like a little rifle round. I was easily able to make hit after hit at 50 yards on a 10” plate using the Black Hills load.

Josh Wayner for TTAG

That said, I am always impressed by the ammo innovations from the other companies featured in this article. For general carry, SIG’s 365 load is hard to beat. You can read the review of that here.

Hornady makes some of the best carry ammo out there and their Critical Duty loads inspire plenty of confidence. Look forward to a detailed review of the 135gr +P this summer. Buffalo Bore makes some of the most powerful 9mm available and you can find a review of the 95gr +P+ here.

Josh Wayner for TTAG

The P365 Manual Safety is one of the most compact, reliable, and accurate handguns you can own today. I will be using this pistol across the summer for most of my ammo testing, and I think it will serve both of us well as you read the results. I don’t recommend this gun lightly, either.

Specifications: SIG SAUER P365 Manual Safety

Caliber: 9x19mm
Barrel Length: 3.1”
Overall Length: 5.8”
Width: 1”
Weight: 17.8oz
Sights: SIG XRAY Night Sights
Magazine: 10 round standard, 12-round magazine and 15-round magazine available
MSRP: $599 (typically seen about $100 lower at retail)

Ratings (Out of Five Stars):

Accuracy * * * * *
This is probably the most accurate compact semi-automatic everyday carry gun I’ve ever used. It shot like a full-size pistol and refused to miss even small targets.

Reliability * * * * *
I put a stupid number of rounds through the P365-MS in only a couple of range trips to ensure it goes bang every time. In 2,000+ rounds it never failed to feed or fire.

Ergonomics * * * * *
This is a well-engineered pistol that feels great in the hand and carries easily on the hip. The manual safety lever feels very positive and is easy to disengage.

Customize This * * * * *
It already comes with XRAY3 night sights. The ability to change out grip modules, change colors, add lasers, swap sights, and go from 10+1 to as many as 15+1 makes this one of the most user-friendly carry guns out there.

Aesthetics * * * * 
The gun is small and efficient. I love the smooth lines and high capacity (for a sub-compact). Many small carry guns are plain, if not downright ugly, but SIG managed to make the P365 fairly pretty and functional.

Overall * * * * *
Just like the article says, this may be the best carry gun on the market right now. There is really nothing out there that even comes close when you blend size, capacity and features. The addition of the safety option makes it darn close to perfect.

via The Truth About Guns
Gun Review: SIG SAUER P365 Manual Safety

Server-side Font-Awesome rendering with Laravel

Font Awesome Blade directives for Laravel

This package will render font awesome icons in your views on the server side. This removes the need to add extra JavaScript or webfont resources on the client side and in doing so reduces the size of your website significantly.

This is achieved by replacing the icons with their SVG counterparts before sending the response to the client.

Requirements

  • PHP >= 7.1.3
  • Laravel >= 5.6

Installation

Install the package using Composer.

composer require jerodev/laravel-font-awesome 

Service Provider

The package will be auto-discovered by Laravel. If you disabled auto-discovery, you should add the following provider to your config/app.php file.

\Jerodev\LaraFontAwesome\FontAwesomeServviceProvider::class, 

Usage

To use Font Awesome icons in your view there are a few new blade directives.

// Let the package discover the best library for this icon.
@fa('laravel')

// Define the library that should be used.
@far('circle') // Regular
@fas('circle') // Solid
@fab('laravel') // Brands

When using the @fa() directive, the package will scan the different Font Awesome libraries and use the first library where it finds the icon.

The order in which the libraries are scanned is regular, brands, solid. But this can be modified in the configuration.
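For example, to have solid icons found first, the order could be changed after publishing the config file; a sketch of the published fontawesome.php (the exact default file shape is an assumption):

```php
<?php

// config/fontawesome.php, created by vendor:publish.
// @fa() will now search the solid library before regular and brands.
return [
    'libraries' => ['solid', 'regular', 'brands'],
];
```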

Middleware

This package includes a middleware that injects a minimal stylesheet into your views on render. By default, this middleware is added to the web middleware group.

If you don’t want to have the style injected automatically, you can disable middleware.all_requests in the configuration. In this case, you will have to add the middleware to selected routes yourself or add your own CSS.

The middleware you should use is \Jerodev\LaraFontAwesome\Middleware\InjectStyleSheet::class.
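If automatic injection is disabled, attaching it to individual routes might look like this sketch (the route and controller names are purely illustrative):

```php
<?php

// routes/web.php
use Jerodev\LaraFontAwesome\Middleware\InjectStyleSheet;

// Inject the stylesheet only on pages that actually render icons.
Route::get('/dashboard', 'DashboardController@index')
    ->middleware(InjectStyleSheet::class);
```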

Configuration

The package contains a few configuration options that can be modified by first publishing the config file using the command below. This will create a fontawesome.php file in your config folder.

php artisan vendor:publish --provider="Jerodev\LaraFontAwesome\FontAwesomeServviceProvider" 

  • libraries (string[], default ['regular', 'brands', 'solid']): the icon libraries that will be available. This is also the order in which the libraries will be searched for icons.
  • middleware.all_requests (boolean, default true): when enabled, the stylesheet needed for the icons will automatically be injected on every request returning HTML.

To Do

Currently the package only supports basic icon rendering. There is no support for special cases such as stacking or masking icons, because I have never used these myself.

In the future, however, I want to add these as well, so this package supports the full API available in the Font Awesome library.

via Laravel News Links
Server-side Font-Awesome rendering with Laravel

Troubleshooting Errors and Performance Issues in Laravel Using Logs

In a perfect world, there wouldn’t be any errors or bugs in production applications. However, we don’t live in a perfect world, and from experience, you know there is no such thing as a bug-free application. If you are using the Laravel framework, you can leverage its log tracking and error logging to catch bugs early and enhance the performance of your Laravel-based application.

Laravel comes pre-packaged with tools to help you track and monitor events. This reduces the effort required to track down those bugs. It comes with a stackable logging system built on top of the popular Monolog library. It also allows you to set up multiple channels based on the severity of the log or event. These channels include stack (stacked), single, daily, Slack, syslog, monolog, SolarWinds® Papertrail®, and so on.

Single Server Environment

Configuring logging for a single server production environment is simple and straightforward. Since the data is always retained on the server, we do not have to worry about keeping the logs offsite. Laravel handles the log rotation, so you do not have to manually maintain that information either. The following configuration logs debug level errors and exceptions to a log file.

return [
    'default' => env('LOG_CHANNEL', 'stack'),

    'channels' => [
        'stack' => [
            'driver' => 'stack',
            'channels' => ['daily'],
            'ignore_exceptions' => false,
        ],

        'daily' => [
            'driver' => 'daily',
            'path' => storage_path('logs/laravel.log'),
            'level' => 'debug',
            'days' => 14,
        ],
    ],
];

Production Environments

Production environments are dynamic and they often scale up and down, which means that the servers will be created or destroyed with increases and decreases in user traffic or load. This means you cannot rely on file-based logging because storage is ephemeral and load balancers make it quite difficult to track down which web server received a request. In order to aggregate and save the logs for any retention period, you’ll have to set up a dedicated syslog server or use a service like Papertrail, which can store logs and make them accessible in a web browser.

A screenshot of logs in Papertrail.

While setting up a log server is easy, it’s not practical for everyone. A decent amount of Linux knowledge and a dedicated server are required, especially if you’re managing more than one production environment.

I like Papertrail for its ease of setup and use. You just have to set up the logging channel to “papertrail” and add two configurations into your environment file (.env).

PAPERTRAIL_URL=logsXXX.papertrailapp.com
PAPERTRAIL_PORT=52204

You should be able to get the values for these settings in your Papertrail account. There is also documentation on the Laravel site about configuring the Papertrail channel.
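For reference, the Papertrail channel described in the Laravel documentation looks roughly like this in config/logging.php (check the docs for your Laravel version for the exact shape):

```php
'papertrail' => [
    'driver' => 'monolog',
    'level' => 'debug',
    'handler' => \Monolog\Handler\SyslogUdpHandler::class,
    'handler_with' => [
        'host' => env('PAPERTRAIL_URL'),
        'port' => env('PAPERTRAIL_PORT'),
    ],
],
```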

Once set up, you can easily monitor and search your logs.

Common Errors

5XX Server Errors

These are hard errors thrown on the server side. They usually mean something unexpected happened and your program didn’t know how to handle the situation. The error could be caused by many things, and we’ll describe a few common ones below.

Code Syntax Errors

These errors are easy to detect if you have debug turned on or if you’re looking at the error log. Errors usually start with syntax error, unexpected… and will give you the name of the file and line number of the code that caused the error.

PHP version Compatibility

PHP 5.6 and 7.0 hit EOL (end of life) last year and chances are your server isn’t using those versions or won’t be for much longer. I’ve listed a few tools below which you can use to check your code for compatibility.

  • php7mar – PHP 7 Migration Assistant Report (MAR) (Recommended)
  • phpstan – PHP Static Analysis and compatibility check
  • phan – A static analyzer, PHP 7 checker

504 Gateway Time-out

These errors usually happen when you’re running PHP outside of Apache/Nginx as a separate process (for example, FPM or CGI) and the response isn’t returned to your web server in a timely manner. The performance-tracking middleware defined in the Performance Issues section later in this article might help you track down these issues.

Database Connection Issues

This can be solved by checking the database credentials in your environment file and making sure that the credentials are correct, that the application can communicate with the database server, and that the database and tables exist.

The following are a few log messages thrown by this issue:

SQLSTATE[HY000] [1045] Access denied for user [USERNAME]
SQLSTATE[HY000] [2002] Operation timed out

4XX Client Errors

These errors are considered soft errors and are usually thrown when we receive an invalid request from a client.

404 Page Not Found

This error usually means that Laravel was not able to find the controller or controller method, or that a 404 error was thrown by your custom code. Some common steps to debug this are to confirm that the route, controller, and method exist. If they all exist, check whether your code is throwing a 404 error.

Some common 404 errors are as follows:

  • “404 | Not Found”: the route is not defined, or no results were found when findOrFail() was called on a model.
  • “Class [CLASS_NAME] does not exist”: the controller class does not exist or the path to it is incorrect.
  • “Method [METHOD_NAME] does not exist”: the method does not exist or the path to it is incorrect.

Note: calling findOrFail() on a model results in a 404 error if no matching result is found in the database.
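A minimal sketch of where this comes up (the model and view names are illustrative):

```php
public function show($id)
{
    // findOrFail() throws a ModelNotFoundException when no row matches,
    // which Laravel's exception handler renders as a 404 response.
    $user = \App\User::findOrFail($id);

    return view('users.show', ['user' => $user]);
}
```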

419 Page Expired

Laravel comes pre-loaded with CSRF protection and requires you to pass that token with all your non-GET requests. This requires a valid session to be maintained on the server side.

If you wish to exclude any requests from CSRF verification, then add those requests to your VerifyCsrfToken middleware.

/**
 * The URIs that should be excluded from CSRF verification.
 *
 * @var array
 */
protected $except = [
    'foo/bar',
    'foo/*',
    'http://example.com/foo/bar',
    'http://example.com/foo/*',
];

422 Unprocessable Entity

This error usually means that the data you posted using AJAX was invalid for this request. This happens when you have request validation rules set up; Laravel validates each request before it is passed on to your controller method. The issue may be in your data or in the validation rules used for the request.

When an AJAX request is made with a JSON content type, it will return an error message like the following, explaining what the issue is.

{
    "message": "The given data was invalid.",
    "errors": {
        "name": ["The name field is required."],
        "email": ["The email field is required."]
    }
}

Out of Memory Error

Memory limits on PHP applications exist for a reason: you do not want to allow your application an unlimited amount of memory. If you’re receiving an out of memory error, it usually means there is a memory leak in your application. Maybe you forgot to reset an object, and its size continues to increase until the application doesn’t have any memory left to run.

Allowed memory size of [SIZE] bytes exhausted (tried to allocate [SIZE] bytes)

The following loop will keep running until the program runs out of memory:

$data = [];
$i = 100;

// The condition is always true, so the array keeps growing
// until the allowed memory size is exhausted.
while ($i >= 100) {
    $data[] = time();
    $i++;
}

Performance Issues

Slow Queries

A single slow query can slow down your entire application. You did everything you could think of and made sure that all of your queries were using indexes, but it didn’t help; maybe you still missed something. How do you monitor for that? It’s simple enough in Laravel. Add the following code to your AppServiceProvider::boot() method:

public function boot()
{
    \DB::listen(function ($sql) {
        if ($sql->time > 1000) {
            \Log::info("SLOW QUERY DETECTED:");
            \Log::info($sql->sql);
            \Log::info($sql->bindings);
            \Log::info($sql->time);
        }
    });
}

This will log every query that takes longer than a second (1000 ms) to execute. You can adjust the time accordingly to fit your needs. Running an explain on your query will provide you with some insights as to why your query is running slow.

Slow Response Time

Tracking slow response time in Laravel is relatively simple. You can define a middleware like the following example and include it in your requests.

<?php

// File: app/Http/Middleware/Performance.php

namespace App\Http\Middleware;

use Closure;

class Performance
{
    public function handle($request, Closure $next)
    {
        return $next($request);
    }

    public function terminate($request, $response)
    {
        $execution_time = microtime(true) - LARAVEL_START;

        if ($execution_time > 1) { // change 1 to a desired number of seconds
            \Log::info("Slow Response[{$execution_time}]: You should log some information here.");
        }
    }
}

After creating the middleware, do not forget to include it in your Kernel.php like this:

protected $routeMiddleware = [
    'auth' => \App\Http\Middleware\Authenticate::class,
    // ...
    'performance' => \App\Http\Middleware\Performance::class,
];

You can also include it in your web or API route groups if you want it to run on all web/API requests.

Caching

Caching your content is another way to optimize your application: you don’t have to fetch data from your database if it doesn’t change often. Laravel provides drivers for file, database, Memcached, and Redis backends. By default, Laravel uses a file-based caching system. Configuring caching in Laravel is easy and can be done in minutes.
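As a sketch, caching a query result with the Cache facade might look like this (the key name and TTL are arbitrary; note that the TTL argument is in seconds in Laravel 5.8+ and minutes in earlier versions):

```php
use Illuminate\Support\Facades\Cache;

// Serve the product list from the cache for 10 minutes
// instead of querying the database on every request.
$products = Cache::remember('products.all', 600, function () {
    return \App\Product::all();
});
```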

Recap

In this article, you learned how to use logging to reduce the time and effort spent debugging your code while improving the performance of your application at the same time.

Data is your friend and it’s there waiting for you to make use of it! Tools like SolarWinds® Papertrail® make it easier to access logs and debug problems. Check out the Papertrail free trial.

via Laravel News Links
Troubleshooting Errors and Performance Issues in Laravel Using Logs

SCOTUS: Ban on “FUCT” trademark registration violates First Amendment


Federal law prohibits the registration of trademarks that are “immoral or scandalous.” At least it did until today, when the Supreme Court held that the requirement violated the First Amendment.

The case focused on artist and entrepreneur Erik Brunetti, who sells clothing under the trademark FUCT. Brunetti claims the mark is “pronounced as four letters, one after the other: F-U-C-T.” But a lot of people have interpreted it as (in the words of the government’s lawyer in the case) “the profane past participle form of a well-known word of profanity.”

Beyond that, the US Patent and Trademark Office looked at the products being sold under the FUCT mark. “Brunetti’s website and products contained imagery, near the mark, of ‘extreme nihilism’ and ‘antisocial’ behavior,” the Supreme Court noted in its Monday opinion. The trademark office concluded that the FUCT mark “communicated misogyny, depravity, and violence,” and rejected the registration.

But that’s exactly the kind of analysis a government agency isn’t allowed to do under the First Amendment, a six-justice majority ruled.

Bong hits 4 Jesus

“If a trademark registration bar is viewpoint-based, it is unconstitutional,” the court held.

The justices noted that the trademark office has a track record of making viewpoint-based judgments about trademark registrations under the immorality clause.

A registration for “YOU CAN’T SPELL HEALTHCARE WITHOUT THC” got rejected as immoral and scandalous, while a registration for “SAY NO TO DRUGS—REALITY IS THE BEST TRIP IN LIFE” was accepted. Beverages called “MARIJUANA COLA” and “KO KANE” were also refused trademark registrations. “JESUS DIED FOR YOU” could be registered, but “BONG HITS 4 JESUS” couldn’t.

That’s a problem, the court concluded, because “a law disfavoring ‘ideas that offend’ discriminates based on viewpoint, in violation of the First Amendment.”

The case wasn’t about Brunetti’s right to sell products with the “FUCT” label on them. It’s perfectly legal to sell products with an unregistered mark, and these marks even have limited legal protections. But registration provides much stronger legal powers to prevent others from using the trademark without permission.

The ruling follows a landmark 2017 precedent

The ruling comes two years after another Supreme Court ruling on offensive trademarks. In that case, the high court struck down language prohibiting the registration for trademarks that were disparaging to any individual or group of people.

That case focused on an Asian-American rock band called “The Slants.” The band was attempting to reclaim a derogatory term for Asian people, but the trademark office rejected the trademark for disparaging Asians. The Supreme Court ruled it was unconstitutional for the government to prohibit trademarks that disparage people while allowing trademarks that praise them.

Now the Supreme Court has applied the same reasoning to the rule against immoral and scandalous trademarks. However, the high court left open the possibility that Congress could ban a narrower class of trademarks that are lewd, sexually explicit, or profane. If Congress chose to pass such a law, then trademarks using the F-word might once again be excluded from registration. But if you’ve always wanted to own a registered trademark with a swear word in it, now is your chance.

via Ars Technica
SCOTUS: Ban on “FUCT” trademark registration violates First Amendment

Broadcasting with Laravel, Passport, Pusher & Vue.js

When using Laravel + Passport to create a headless application (in this case, the API for the back office of a commerce platform), we couldn’t find a single clear tutorial covering both how to set up the API (with, for example, the broadcast authorization route) and the Pusher integration in Vue, using the token originally created by Passport during authentication. We wrote this step-by-step tutorial after implementing the solution.
via Laravel News Links
Broadcasting with Laravel, Passport, Pusher & Vue.js

Laravel Eloquent UUID

Eloquent UUID

Introduction

A simple drop-in solution for providing UUIDv4 support for the IDs of your Eloquent models.

Installing

You can install the package via composer:

composer require goldspecdigital/laravel-eloquent-uuid

Usage

When creating an Eloquent model, instead of extending the standard Laravel model class, extend the model class provided by this package:

<?php

namespace App\Models;

use GoldSpecDigital\LaravelEloquentUUID\Database\Eloquent\Model;

class BlogPost extends Model
{
    //
}

User model

The User model that comes with a standard Laravel install has some extra configuration which is implemented in its parent class. This configuration only consists of implementing several interfaces and using several traits.

A drop-in replacement has been provided which you can use just as above, by extending the User class provided by this package:

<?php

namespace App\Models;

use GoldSpecDigital\LaravelEloquentUUID\Foundation\Auth\User as Authenticatable;

class User extends Authenticatable
{
    //
}

Generating UUIDs

If you don’t specify a value for the primary key of your model, a UUIDv4 will be generated automatically. However, if you do specify your own UUIDv4, it will be used instead of a generated one. This can be useful when you need to know the ID of the model before you have created it:

// No UUID provided (automatically generated).
$model = Model::create();
echo $model->id; // abb034ae-fcdc-4200-8094-582b60a4281f

// UUID explicitly provided.
$model = Model::create(['id' => '04d7f995-ef33-4870-a214-4e21c51ff76e']);
echo $model->id; // 04d7f995-ef33-4870-a214-4e21c51ff76e

Running the tests

To run the test suite you can use the following commands:

# To run both style and unit tests.
composer test

# To run only style tests.
composer test:style

# To run only unit tests.
composer test:unit

If you receive any errors from the style tests, you can automatically fix most, if not all of the issues with the following command:

Contributing

Please read CONTRIBUTING.md for details on our code of conduct, and the process for submitting pull requests to us.

Versioning

We use SemVer for versioning. For the versions available, see the tags on this repository.

Authors

See also the list of contributors who participated in this project.

License

This project is licensed under the MIT License – see the LICENSE.md file for details.

via Laravel News Links
Laravel Eloquent UUID

Laravel Data Sync

Laravel utility to keep records synced between environments through source control

Installation & Usage

  • Via composer: composer require distinctm/laravel-data-sync
  • Run php artisan vendor:publish --provider="distinctm\LaravelDataSync\DataSyncBaseServiceProvider" --tag="data-sync-config" to publish the config file. Specify the directory for sync data files (the default is a new sync directory in the project root)
  • Create a JSON file for each model, using the model name as the filename. Example: Product.json would update the Product model
  • Use nested arrays in place of hardcoded IDs for relationships
  • Run php artisan data:sync (or php artisan data:sync --model={model} with the model flag to specify a model)

Optional

If using Laravel Forge, you can have the data sync run automatically on deploy. Edit your deploy script in Site -> App to include:

if [ -f artisan ]
then
    php artisan data:sync
    php artisan migrate --force
fi

Notes

  • use studly case for model name relationships as JSON keys (example: 'option_group' => 'OptionGroup'). This is important for case-sensitive file systems.
  • empty values are skipped
  • the criteria/attributes for updateOrCreate are identified with a leading underscore
  • nested values represent relationships and are returned using where($key, $value)->first()->id
  • order of import can be set in config/data-sync.php with an array:
return [
    'path' => base_path('sync'),
    'order' => [
        'Role',
        'Supervisor',
    ]
];

Examples

User.json:

[
    {
        "name": "Ferris Bueller",
        "properties->title": "Leisure Consultant",
        "phone_numbers->mobile": "555-555-5555",
        "phone_numbers->office": "",
        "_email": "ferris@buellerandco.com",
        "department": {
            "name": "Management",
            "location": {
                "name": "Chicago"
            }
        }
    }
]

translates to…

User::updateOrCreate([
    'email' => 'ferris@buellerandco.com',
], [
    'name' => 'Ferris Bueller',
    'properties->title' => 'Leisure Consultant',
    'phone_numbers->mobile' => '555-555-5555',
    'department_id' => Department::where('name', 'Management')
        ->where('location_id', Location::where('name', 'Chicago')->first()->id)
        ->first()
        ->id,
]);

Role.json:

[
    { "_slug": "update-student-records" },
    { "_slug": "borrow-ferrari" },
    { "_slug": "destroy-ferrari" }
]

translates to…

Role::updateOrCreate(['slug' => 'update-student-records']);

Role::updateOrCreate(['slug' => 'borrow-ferrari']);

Role::updateOrCreate(['slug' => 'destroy-ferrari']);

RoleUser.json (pivot table with model):

[
    {
        "_user": { "email": "ferris@buellerandco.com" },
        "_role": { "slug": "update-student-records" }
    },
    {
        "_user": { "email": "ferris@buellerandco.com" },
        "_role": { "slug": "borrow-ferrari" }
    },
    {
        "_user": { "email": "ferris@buellerandco.com" },
        "_role": { "slug": "destroy-ferrari" }
    }
]

translates to…

RoleUser::updateOrCreate([
    'user_id' => User::where('email', 'ferris@buellerandco.com')->first()->id,
    'role_id' => Role::where('slug', 'update-student-records')->first()->id,
]);

RoleUser::updateOrCreate([
    'user_id' => User::where('email', 'ferris@buellerandco.com')->first()->id,
    'role_id' => Role::where('slug', 'borrow-ferrari')->first()->id,
]);

RoleUser::updateOrCreate([
    'user_id' => User::where('email', 'ferris@buellerandco.com')->first()->id,
    'role_id' => Role::where('slug', 'destroy-ferrari')->first()->id,
]);

via Laravel News Links

GitHub Actions for PHP Developers

If you’re a developer using GitHub, you’ve probably heard of GitHub Actions, GitHub’s new automated workflow system (similar to GitLab’s CI feature). It was announced in October 2018 and is currently still in beta (you can sign up here to get access).

I recently got access to the beta and began developing Actions suited for my projects. As I currently spend most of my time writing code in PHP, the Actions mentioned in this article are focused on that language. The logic can be easily ported to other languages though.

What are Workflows and Actions and what can I do with them?

A Workflow is a collection of multiple Actions that can be triggered by GitHub webhook events (for example, when code is pushed to the repository, a Pull Request has been merged, or a new Issue has been created).

An Action can do basically anything: run your test suite, publish a package to npm, deploy your site to your server, send a Slack message. You name it.
The code for an Action can live in the project repository itself, in a separate public GitHub repository or in a Docker Hub image.

Your Workflow is defined in the main.workflow file in the .github folder of your repository. This means your Actions are written in code and live in version control. If you prefer to work with a GUI, Workflows and Actions can also be configured and edited in a visual editor on github.com.

Screenshot of the Visual Editor for GitHub Actions on github.com

In this post I’m going to cover 3 Actions which I think could be useful for PHP developers in their daily workflows:

  • Run your phpunit test suite
  • Run phpinsights
  • Run php-cs-fixer

I’ve published a sample Laravel application, with all three Actions configured, on GitHub. You can clone it, fork it and see for yourself how the Actions are set up.
The process of adding those Actions is documented in this Pull Request.

As mentioned earlier, your Workflow and Actions are defined in a main.workflow file. The final file for our sample application looks like this:

workflow "phpunit / phpinsights / php-cs-fixer" {
  on = "push"
  resolves = [
    "phpunit",
    "phpinsights",
    "auto-commit-php-cs-fixer",
  ]
}

action "composer install" {
  uses = "MilesChou/composer-action@master"
  args = "install -q --no-ansi --no-interaction --no-scripts --no-suggest --no-progress --prefer-dist"
}

action "phpunit" {
  needs = ["composer install"]
  uses = "./actions/run-phpunit/"
  args = "tests/"
}

action "phpinsights" {
  needs = ["composer install"]
  uses = "stefanzweifel/laravel-phpinsights-action@v1.0.0"
  args = "-v --min-quality=80 --min-complexity=80 --min-architecture=80 --min-style=80"
}

action "php-cs-fixer" {
  uses = "docker://oskarstark/php-cs-fixer-ga"
}

action "auto-commit-php-cs-fixer" {
  needs = ["php-cs-fixer"]
  uses = "stefanzweifel/git-auto-commit-action@v1.0.0"
  secrets = ["GITHUB_TOKEN"]
  env = {
    COMMIT_MESSAGE = "Apply php-cs-fixer changes"
    COMMIT_AUTHOR_EMAIL = "jon.doe@example.com"
    COMMIT_AUTHOR_NAME = "Jon Doe"
  }
}

GitHub Actions Workflows are not written in YAML or JSON, but in HCL (HashiCorp Configuration Language). If you’ve worked with Terraform in the past, the syntax might look familiar to you.

I won’t go deep into what each keyword in the Workflow syntax does (uses, needs, secrets, etc.); you should read the documentation for that.

The most important keywords for us right now are:

  • uses: Sets which Action in a Workflow should be used
  • needs: Sets which Actions must run successfully before this Action runs (similar to dependencies)
  • env: Environment variables defined in the Workflow file itself; they allow you, as an Action consumer, to change things within an Action
  • secrets: Secret environment variables, like API keys, which should not be stored in the repository

Dockerfile? Oh no!

Each Action is executed in a Docker container. Therefore, each Action needs a Dockerfile. If you now think: "Oh no! I don’t know Docker!", then we have something in common. I’ve read and heard a lot about Docker over the years, but never really worked with it.

The good thing is that you don’t have to be a Docker expert to create or work with Actions. All you need to do is set a base image and then you’re good to go. The "core" Actions code can basically be written in any language.

One caveat you have to keep in mind when working with Actions is that even though each Action runs in its own container, the underlying filesystem is shared between Actions. Meaning: if one Action changes repository files, those changes are also visible to other Actions.
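
A toy shell sketch of that behavior (the file name and contents are made up; only the shared-directory mechanism is the point):

```shell
# Simulate two Actions that run in separate containers but operate
# on the same mounted workspace directory.
WORKSPACE=$(mktemp -d)

# "Action 1" (think php-cs-fixer) rewrites a file in the workspace.
echo "<?php // fixed" > "$WORKSPACE/app.php"

# "Action 2" (think git-auto-commit) runs later and sees the change,
# because both containers mount the same checkout.
CHANGED=$(cat "$WORKSPACE/app.php")
echo "$CHANGED"

rm -rf "$WORKSPACE"
```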

So let’s get started with our first PHP Action.

Install Composer dependencies

To do anything with a Laravel project, we first have to install its dependencies with Composer.

This can be accomplished by using the general composer Action developed by MilesChou.

workflow "composer install" {
  on = "push"
  resolves = [ "composer install" ]
}

action "composer install" {
  uses = "MilesChou/composer-action@master"
  args = "install -q --no-ansi --no-interaction --no-scripts --no-suggest --no-progress --prefer-dist"
}

As all following tasks depend on the Composer dependencies, every other Action should list "composer install" in its needs keyword. The "composer install" Action is therefore executed before all other Actions. (GitHub is clever enough, though, to only run the Action once.)
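
Conceptually, needs behaves like shell short-circuiting: a step only runs when the steps before it succeeded. A self-contained sketch with stubbed commands (the function names are invented stand-ins for the real Actions):

```shell
# Stand-ins for the real Actions; each succeeds and prints a marker.
composer_install() { echo "deps installed"; }
run_phpunit()      { echo "tests passed"; }
run_phpinsights()  { echo "insights ok"; }

# Later steps only run if the earlier ones exited successfully,
# which is roughly what `needs = ["composer install"]` expresses.
if composer_install && run_phpunit && run_phpinsights; then
  STATUS="success"
else
  STATUS="failed"
fi
echo "$STATUS"
```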

Action 1: Run phpunit test suite

One of the most common things to do in Continuous Integration is running your project’s test suite on each code push.

As the test suite setup differs from project to project, I won’t use a publicly available Action to run phpunit; I will use an Action defined in the project itself. In the root of my project I create an actions/run-phpunit folder and within it the following files:

Dockerfile

FROM php:7.3-cli-alpine

LABEL "com.github.actions.name"="phpunit-test"
LABEL "com.github.actions.description"="Run phpunit test suite"
LABEL "com.github.actions.icon"="check-circle"
LABEL "com.github.actions.color"="green"

ADD entrypoint.sh /entrypoint.sh
ENTRYPOINT ["/entrypoint.sh"]

entrypoint.sh

#!/bin/sh

set -eu

cp .env.example .env
php artisan key:generate

vendor/bin/phpunit $*

In the Dockerfile we tell Actions that we want to use PHP 7.3 and that the entrypoint.sh file should be executed next.

The entrypoint.sh file is where the logic of the Action lives. To run the test suite, we first create a copy of the example Laravel environment file and generate a fresh application key.

Next, phpunit is executed. Any provided arguments (args) are passed down and will be placed where the $* variable is.
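
The forwarding works like ordinary positional-parameter expansion in shell. A toy stand-in for the entrypoint makes it visible (note that "$@" would handle arguments containing spaces more safely than $*):

```shell
# Toy version of the entrypoint's last line: forward whatever
# arguments the Action received straight to the wrapped command.
run_phpunit() {
  # stands in for: vendor/bin/phpunit "$@"
  echo "phpunit called with: $*"
}

OUT=$(run_phpunit tests/ --stop-on-failure)
echo "$OUT"
```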

If your project needs more PHP extensions or a MySQL database, the entrypoint.sh would be the place where you set these things up. (That’s why I didn’t use an already existing Action. The setup differs from project to project.)
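
If extra PHP extensions are needed, they can also be baked into the image itself rather than set up at run time; a sketch (the extension list here is a made-up example, not something this project requires):

```dockerfile
FROM php:7.3-cli-alpine

# Hypothetical extras: install whatever extensions your test suite needs.
RUN docker-php-ext-install pdo_mysql bcmath
```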

In our Workflow file, we can now add the Action:

workflow "phpunit" {
  on = "push"
  resolves = [ "phpunit" ]
}

action "composer install" {
  uses = "MilesChou/composer-action@master"
  args = "install -q --no-ansi --no-interaction --no-scripts --no-suggest --no-progress --prefer-dist"
}

action "phpunit" {
  needs = ["composer install"]
  uses = "./actions/run-phpunit/"
  args = "tests/"
}

Even though this was quite easy to set up, I personally wouldn’t use GitHub Actions to run the test suite for my bigger projects yet.

Other CI services like Travis or Circle CI are much better suited for this task. These services give you richer notifications and performance features like parallelism and dependency caching out of the box. With GitHub Actions you would have to implement all of this on your own.

However, I think Actions is perfectly fine for smaller projects.

Action 2: Run phpinsights

Our next Action is going to run phpinsights on each push. As we don’t have to set up a database or PHP extensions to run insights, we can use a prepackaged Action rather than one defined in the project. As no phpinsights Action existed when I wrote this article, I wrote my own: phpinsights-action and laravel-phpinsights-action.

As we’re working with a Laravel app, we’re going to use the Laravel version of the Action.

workflow "phpunit / phpinsights / php-cs-fixer" {
  on = "push"
  resolves = [ "phpunit", "phpinsights" ]
}

action "phpinsights" {
  needs = ["composer install"]
  uses = "stefanzweifel/laravel-phpinsights-action@v1.0.0"
  args = "-v --min-quality=80 --min-complexity=80 --min-architecture=80 --min-style=80"
}

As you can see, I’m passing the --min-xxx arguments to the Action. If the code quality of my app drops in a future Pull Request, the Action returns a "failed" status code, which in turn marks the Pull Request as failed.
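
The threshold mechanics reduce to a numeric comparison and an exit status; a minimal sketch (the score value is invented):

```shell
# Sketch of how a --min-* threshold turns into a failed check:
# the Action exits non-zero when a score falls below the minimum.
SCORE=75          # hypothetical quality score reported by the tool
MIN_QUALITY=80    # mirrors --min-quality=80

if [ "$SCORE" -lt "$MIN_QUALITY" ]; then
  RESULT="failed"
else
  RESULT="passed"
fi
echo "$RESULT"
```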

To see the issues reported by phpinsights, you can open the log on github.com.

Screenshot of PHP Insights output

Action 3: Run php-cs-fixer

Another common use case for a CI service is to check whether your code follows the coding conventions and style guide you or your team have defined.

This can be accomplished by running an existing php-cs-fixer Action developed by Oskar Stark.

workflow "phpunit / phpinsights / php-cs-fixer" {
  on = "push"
  resolves = [
    "phpunit",
    "phpinsights",
    "php-cs-fixer",
  ]
}

action "php-cs-fixer" {
  uses = "docker://oskarstark/php-cs-fixer-ga"
}

Here we’ve used another way to define an Action: By using the docker:// protocol, you can directly point to an image on DockerHub.

The Action uses your existing .php_cs configuration and runs php-cs-fixer on your project. However, the Action always returns a "successful" status code, even if violations are found.

Screenshot of Output of php-cs-fixer Action

But when violations do happen, php-cs-fixer automatically fixes them. So wouldn’t it be cool if the fixed files were automatically committed and pushed back to your branch? 🤔
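
At its core, "did the fixer change anything?" is a file-comparison question (the real auto-commit Action asks git; this sketch just compares two temp files with invented contents):

```shell
# Compare a file before and after a hypothetical fixer run and
# decide whether an auto-commit step would have work to do.
BEFORE=$(mktemp)
AFTER=$(mktemp)
printf 'echo"x";\n'  > "$BEFORE"   # style violation
printf 'echo "x";\n' > "$AFTER"    # fixer output

if cmp -s "$BEFORE" "$AFTER"; then
  DECISION="nothing to commit"
else
  DECISION="commit and push fixes"
fi
echo "$DECISION"

rm -f "$BEFORE" "$AFTER"
```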

Action 3 Bonus: Commit fixed files

As I said at the beginning of this article, the underlying filesystem is shared between the Actions in a Workflow. So committing the fixed files is just "an Action away".

I’ve created a git-auto-commit-Action which commits all changed files and pushes the commit back to the repository.

Our updated main.workflow file now looks like this:

workflow "phpunit / phpinsights / php-cs-fixer" {
  on = "push"
  resolves = [
    "phpunit",
    "phpinsights",
    "auto-commit-php-cs-fixer",
  ]
}

action "php-cs-fixer" {
  uses = "docker://oskarstark/php-cs-fixer-ga"
}

action "auto-commit-php-cs-fixer" {
  needs = ["php-cs-fixer"]
  uses = "stefanzweifel/git-auto-commit-action@v1.0.0"
  secrets = ["GITHUB_TOKEN"]
  env = {
    COMMIT_MESSAGE = "Apply php-cs-fixer changes"
    COMMIT_AUTHOR_EMAIL = "john.doe@example.com"
    COMMIT_AUTHOR_NAME = "John Doe"
  }
}

Now, on every push, possible style changes are automatically committed and pushed back to your repository. No need to run the command manually or to rely on a third-party service.

Screenshot of auto-commit. Two committers have been attributed with the commit.

We have now also added our first secret: GITHUB_TOKEN. This is a special secret that is available to all Actions in a repository, but it’s not enabled by default; you have to enable it manually in the visual editor on github.com.

Add the GITHUB_TOKEN by checking the corresponding checkbox in the editor.

As this Action uses the GITHUB_TOKEN to authenticate the git push-command, GitHub won’t trigger a second run of the Workflow. (Keep that in mind!)

Outlook

I think this covers the basics of GitHub Actions for PHP developers. I’m very excited about Actions and what the future holds. I hope the feature leaves the beta soon, so that more people can use it in their projects.

GitHub already announced support for Scheduled Workflows, which opens up a big realm of possibilities. (Jason Etcovitch writes here about how he uses Scheduled Workflows to automatically create weekly meeting notes.) Or, for example, you could build an uptime-monitoring Action that is triggered every few minutes and sends Slack or email notifications.

Personally, I would like to use the Scheduling feature to fully automate Laravel Download Statistics (a side project of mine). A workflow could trigger the update of download numbers, create a new HTML export and push everything to Netlify.
No need for humans any more 🤖🤯.

Start developing your own Actions

If I’ve gotten you excited about Actions and you want to start developing your own, here are a few resources I found while working on this article:

And here’s a list of Actions I found that I would actually like to implement in my projects.

Acknowledgments

Thanks to Peter Brinck, Célien Boillat and Max Almonte for proofreading this article and giving feedback.

via Laravel News Links

We sell our pizzas for $16.50. Here’s how the costs break down.


The day my boyfriend put giardiniera on a pizza changed my life forever. That boyfriend, now my husband, is a third-generation pizza maker, and his suburban Chicago family pizzeria has served tavern-style pies and other Chicago-style Italian fare since the 1950s. As much as I’d love to shout out the pizzeria’s name, you’ll soon see why I’m going to keep that detail to myself.

Being married to pizza has many perks, undeniably. But as a pizza aficionado since I could eat solid food, it’s eye-opening to experience owning a pizza restaurant. I’m now immersed in the world of independent restaurants, for all its joys and heartaches. My husband grew up in an entire family that ran pizzerias, so treating a Tuesday afternoon (the slowest business day of the week) like most people’s Saturday is his norm. But for me, the responsibilities of owning a small business have been a huge adjustment.

There are certainly bright sides to owning a pizzeria—yes, I mean free pizza. But, there are downsides, too. Many. My family doesn’t get many holidays; we work Christmas Eve, Mother’s Day, New Year’s Eve and/or Day, Valentine’s Day. Those are big business occasions for the catering side of our mom-and-pop, and those are the days that the regular, already intense hustle goes into overdrive.

Restaurant owners like my husband are always working. Even if he has a day or evening off, something can come up—an employee not showing up to work or the pizza oven breaking in the middle of the Saturday-night rush. It’s a life of early mornings and late nights, whether to meet a contractor to repair an appliance or to organize and deep clean the restaurant.

But one of the more frustrating economic aspects of owning a pizzeria is competing with fast-food pizza chains. Fast-food pizza franchises have Super Bowl-sized advertising budgets, the support of a corporate force behind them, and national name recognition. When their pizza ovens break, there’s protocol. They also notoriously sell cheap pies made with mass-produced ingredients, whereas independent pizzerias like ours stake our reputations on high-quality, homemade pies at higher—but fair—prices.

So why do our pies cost more? What are you buying when you order pizza from us? Let’s break down the costs that go into our pizzeria’s 15-inch cheese pizza (large), which we sell for $16.50.


Ingredients and materials: $3.91

A simple cheese pizza in an independent pizzeria can be anything but simple to make—especially when you’re making the dough and sauce from scratch, and using high-quality, gooey cheese. According to Pizza Today, cheese prices are set by the “open outcry spot market” in a 10-minute daily trading session at the Chicago Mercantile Exchange. As I write this, the price for whole milk mozzarella is $2.08 per pound (we figure half a pound of cheese for each large pizza—$1.04). Wholesale supplies are cost-saving, especially when you’re ordering tomato paste and puree by the case, but together, all the ingredients that go into homemade pizza sauce can cost $1.99 per pizza. Throw in the flour and other ingredients used in the homemade dough, which amount to 45 cents, plus 30 cents for a pizza box and 13 cents for a pizza circle, and the total comes to $1.04 + $1.99 + $0.45 + $0.30 + $0.13 = $3.91.

Labor: $1.60

Minimum wage is a hot topic these days, and varies based on geography. At our pizzeria, the average hourly rate for employees is about $12 an hour. If you calculate the time it takes for one phone operator/cashier to take your order and one cook to make the pizza (roll the dough, add the sauce and cheese, slide it into the oven), cut it, box it and set it aside for pickup or delivery, it is roughly 8-10 minutes in total—excluding cooking time.

Rent, Utilities, Odds/Ends: $5.65

Our suburban space is leased, so while the rent is a significant chunk of money per month, it is most likely lower than, say, a pizzeria in downtown Chicago. But adding up the cost for rent, heat, electricity, water, phone and internet services, the point-of-sale system and the loan repayment (which secured items like the pizza oven, dough mixer, prep tables, refrigerators, pizza rollers and other tools), plus periodic orders and maintenance of odds and ends like kitchen mats, menus, towels and aprons and cleaning supplies, it’s not insignificant.


The total cost to make one large cheese pizza at our independent pizzeria is $11.16, which leaves a $5.34 profit. Keep in mind, though, the one variable a restaurant owner can’t control is how many orders they will get in a given day. It’s the make-or-break factor for any independent restaurant. Regardless of the foot traffic, online orders or calls, owners are still paying hourly wages, utilities and rent for the minutes that tick by without orders ringing in.

Growing up, I would visit family-owned restaurants across Chicago and dream of how cool it would be to own a restaurant and have complete creative control, feeding happy people, and hanging out at a place with unlimited soda.

From working nights, weekends and holidays to finding good help to customer complaints, the business can be a lot less glamorous than most people realize. Sure, the unlimited soda helps, but the job is not clock-in, clock-out. The first few years are a complete struggle, and we often refer to the pizzeria as a baby, because it always comes first. It requires a lot of care and attention, it teaches you a lot and can tear you down before bringing you back up. I don’t expect you to think about all that struggle—or math—while you’re eating our pizza, though. I just hope you think it’s the best pizza you’ve ever tasted, and well worth the money.

via Lifehacker