Blending with Fire

https://theawesomer.com/photos/2023/09/blending_with_fire_t.jpg

Blending with Fire

The guys from How Ridiculous aren’t done destroying things in their gigantic blender. This time, there’s even more spectacle as they satisfy the appliance’s voracious appetite with aerosol deodorant cans, a garbage bin filled with dry ice, and glow sticks. They also fed it a combination of hairspray and sparklers, resulting in some impressive fireballs.

The Awesomer

3 crucial Laravel architecture best practices for 2023

https://life-long-bunny.fra1.digitaloceanspaces.com/media-library/production/56/conversions/programmer_v_02_o9k1tl-optimized.jpg

3 crucial Laravel architecture best practices for 2023

How should you organize your Laravel app to best serve your needs? Well, the good news is that you don’t have to worry about this since you are using a framework! Stick to the defaults unless you have good and objective reasons to do otherwise.

And yet, people can’t stop overthinking the architecture of their projects.

To me, it seems that the urge to deviate from the standard project structure often reveals a deeper issue – a fundamental inability to maintain organization. Whether you adhere to the Laravel architecture or significantly modify it, the outcome is likely to be disorganized.

Therefore, to address this problem, we will put ourselves in the shoes of a team working on what could be almost any enterprise project.

Before we begin, though, let’s define what an “enterprise project” is in our context. Essentially, it’s a public-facing project with lots of users that generates revenue, making it vital to continuously evolve by adapting to new technologies, business requirements, and market trends.

Here’s what is expected from the team behind such a project:

  1. Easy collaboration.
  2. Maximize compatibility with third-party solutions that help keep development costs down.
  3. Keep the cost of onboarding low. To achieve this, new hires need to easily find their way around the codebase, so they can be somewhat productive even before they have deep domain knowledge.

With these goals in mind, let’s dive into what I, and most of the experts from the community, think are the best architecture practices.

Using Laravel is meant to make your life easier, not harder.

  1. First, following conventions helps ensure that new hires can quickly find everything they need and start being productive as soon as possible. Laravel is a popular framework, and most developers will already be familiar with its default folder structure. By sticking to this, you help minimize the learning curve for new team members.
  2. Also, a profitable project is supposed to last for many years. People come and go. You will likely move on to something else. Why wouldn’t you make it easy for the ones who will take over?
  3. Additionally, by following the framework’s defaults, you ensure compatibility with many first and third-party packages. This can be crucial for keeping development costs down and maximizing the use of available resources.

While it’s essential to keep the default folder structure, it’s also necessary to organize your code in a way that makes sense for your project. One way to do this is by organizing it by domain, without breaking the default folder structure.

This means that, for example, inside your Models folder, you could create a Blog folder. This way, when using the php artisan make:model Blog/Category command, the new file will be created in the right place.

This approach can also be used for controllers, middlewares, policies, and so on. Organizing your code the intended way will help you maintain a compatible, clean and intuitive codebase.
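For example, assuming a Blog domain (used here purely as an illustration), the generator commands and the files they produce line up neatly with the default structure:

php artisan make:model Blog/Category
# creates app/Models/Blog/Category.php (namespace App\Models\Blog)

php artisan make:controller Blog/CategoryController
# creates app/Http/Controllers/Blog/CategoryController.php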

Developers love discovering new ways of doing things, and it’s always tempting to experiment with new packages or approaches. This is fine for personal projects or when you are working alone, but it may not be ideal in a team setting.

When you hire Laravel developers, you are hiring them to expand and maintain your product using Laravel. It’s essential to remember this and stick to the built-in features of Laravel whenever possible.

For example, don’t use Data Transfer Objects (DTOs) instead of custom form requests unless there are good and objective reasons to do so. Using the built-in features of Laravel ensures that all developers on your team are working with the same set of tools and reduces the learning curve for new hires.
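As a quick illustration of the built-in approach, here’s what a plain form request might look like — a minimal sketch with a hypothetical StorePostRequest, not code from any particular project:

<?php

namespace App\Http\Requests;

use Illuminate\Foundation\Http\FormRequest;

// Hypothetical example: validation lives where every Laravel developer expects it.
class StorePostRequest extends FormRequest
{
    public function authorize(): bool
    {
        return true;
    }

    public function rules(): array
    {
        return [
            'title' => ['required', 'string', 'max:255'],
            'body' => ['required', 'string'],
        ];
    }
}

Type-hint it in a controller method and Laravel validates the request before your code even runs — no extra mapping layer required.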

Matt Stauffer, who has a lot of experience building apps for enterprise as the CEO of Tighten, talks about how keeping things simple benefits big projects.

James Brooks is a core Laravel team member. He knows what working with a big team and a big codebase is like. He also asked me to include it in this article, so here it is!

https://twitter.com/jbrooksuk/status/1697182125663945015?s=20

Sebastian Schlein is the co-founder of BeyondCo, a company deeply involved with Laravel, and he also thinks that you should stick to the framework’s defaults. This is a tweet from 2019 by the way.

https://twitter.com/seb_sebsn/status/1186228940555345921

Jason McCreary, from Laravel Shift, also showcases his favorite way of organizing Laravel projects. Looks familiar, don’t you think?

https://twitter.com/gonedark/status/1333474208123412488

All that being said, in the end, results matter most. Here’s a tweet from Taylor Otwell himself about keeping an open mind:

https://twitter.com/taylorotwell/status/1668580181504606208

Laravel News Links

Create your own GitHub Actions using Fly Machines

https://fly.io/laravel-bytes/ad-hoc-tasks/assets/on-demand-cover.webp

Machines directing other machines’ tasks

What if you could spin up a VM, do a task, and tear it all down super easily? That’d be neat, right? Let’s see how to spin up a Laravel application on Fly.io and run some ad-hoc tasks!

We’re going to create a setup where we can instruct Fly.io to spin up a VM and run some code. The code will read an instructions.yaml file and follow its instructions. The neat part: We can create the YAML file on-the-fly when we spin up the VM.

This lets us create one standard VM that can do just about any work we need. I structured the YAML a bit like GitHub Actions.

In fact, I built a similar thing before:

It uses Golang, but here we’ll use Laravel 🐐.

Here’s a repository with the code discussed.

The Code

The code is pretty straightforward (partially because I ask you to draw the rest of the owl). Within a Laravel app, we’ll create a console command that does the work we want.

Here’s what I did to spin up a new project and add a console command to it:

composer create-project laravel/laravel ad-hoc-yaml
cd ad-hoc-yaml

composer require symfony/yaml

php artisan make:command --command="ad-hoc" AdHocComputeCommand

We need to parse some YAML, so I also included the symfony/yaml package.

And the command itself is quite simple 🦉:

<?php

namespace App\Console\Commands;

use Symfony\Component\Yaml\Yaml;
use Illuminate\Console\Command;
use Illuminate\Support\Facades\Log;

class AdHocComputeCommand extends Command
{
    protected $signature = 'ad-hoc';

    protected $description = 'Blindly follow some random YAML';

    public function handle()
    {
        try {
            $instructions = Yaml::parseFile("/opt/instructions.yaml");
        } catch(\Exception $e) {
            Log::error($e);
            $this->error($e->getMessage());
            return self::FAILURE;
        }


        foreach($instructions['steps'] as $step) {
            // draw the rest of the <bleep>ing owl
        }

        return self::SUCCESS;
    }
}

For a slightly-more-fleshed out version of this code, see this command class here.
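If you’d rather not draw the owl entirely from scratch, here’s a rough sketch of what the body of that loop could do — purely illustrative, not the author’s actual implementation, assuming each step carries either a run key (a shell command) or a uses key (a named, pre-built task):

use Illuminate\Support\Facades\Process;

foreach ($instructions['steps'] as $step) {
    $label = $step['name'] ?? $step['uses'] ?? $step['run'] ?? 'unnamed step';
    $this->info("Running step: {$label}");

    if (isset($step['run'])) {
        // Arbitrary shell command, with any per-step env vars applied
        $result = Process::env($step['env'] ?? [])->run($step['run']);
        $this->line($result->output());
    } elseif (isset($step['uses'])) {
        // A named task would map to a dedicated class; here we only log it
        $this->line("Would run task '{$step['uses']}' with: " . json_encode($step['with'] ?? []));
    }
}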

Package it Up

Fly uses Docker images to create (real, non-container) VMs. Let’s package our app into a Docker image that we can deploy to Fly.io.

We’ll borrow the base image Fly.io uses to run Laravel apps (fideloper/fly-laravel:${PHP_VERSION}). We’ll just assume PHP 8.2:

# Base image Fly.io provides for running Laravel apps, pinned to PHP 8.2
FROM fideloper/fly-laravel:8.2

# Copy the application code into the image
COPY . /var/www/html

# Install production dependencies only
RUN composer install --optimize-autoloader --no-dev

# Run our ad-hoc console command when the VM starts
CMD ["php", "/var/www/html/artisan", "ad-hoc"]

Once that’s created, we can build that Docker image and push it up to Fly.io. One thing to know here: You can only push images to Fly.io’s registry for a specific app. The image name to use must correspond to an existing app, e.g. docker push registry.fly.io/some-app:latest. You can use any tags (e.g. latest or 1.0) as well as push multiple tags.

So, to push an image up to Fly.io, we’ll first create an app (an app is just a thing that houses VMs) and authenticate Docker against the Fly Registry. Then, when we spin up a new VM, we’ll use that image that’s already in the registry.

This is different from using fly deploy, which builds an image during deployment and is meant more for hosting your web application. Here, we’re using Fly.io for specific tasks rather than hosting a whole application.

The following shows creating an app, building the Docker image, and pushing it up to the Fly Registry:

APP_NAME="my-adhoc-puter"

# Create an app
fly apps create $APP_NAME

# Build the docker image
docker build \
    -t registry.fly.io/$APP_NAME \
    .

# Authenticate with the Fly Registry
fly auth docker

# Push the docker image to the Fly Registry
# so we can use it when creating a new VM
docker push registry.fly.io/$APP_NAME

This article is great at explaining the fun things you can do with the Fly Registry.

We make two assumptions here:

  1. You have Docker locally
  2. You’re on an Intel-based CPU

Pro tip: If you’re on an ARM-based machine (M1/M2 Macs), you can actually VPN into your Fly.io private network and use your Docker builder (all accounts have a Docker builder, used for deploys) via DOCKER_HOST=<ipv6 of builder machine> docker build ....

Run It

To run our machine, all we need is a YAML file and to make an API call to Fly.io.

As mentioned before, Machines (VMs) spun up via API call let you create files on-the-fly! What you do is provide the file name and the base64’ed file contents. The file will be created on the Machine VM before it runs your stuff.

Here’s what the code to make such an API request would look like within PHP / Laravel:

# Some YAML, similar to GitHub Actions
$rawYaml = '
name: "Test Run"

steps:
  - name: "Print JSON Payload"
    uses: hookflow/print-payload
    with:
      path: /opt/payload.json

  - name: "Print current directory"
    run: "ls -lah $(pwd)"

  - run: "echo foo"

  - uses: hookflow/s3
    with:
      src: "/opt/payload.json"
      bucket: "some-bucket"
      key: "payload.json"
      dry_run: true
    env:
      AWS_ACCESS_KEY_ID: "abckey"
      AWS_SECRET_ACCESS_KEY: "xyzkey"
      AWS_REGION: "us-east-2"
';

$encodedYaml = base64_encode($rawYaml);

# Some random JSON payload that our YAML
# above references
$somePayload = '
{
    "data": {
        "event": "foo-happened",
        "customer": "cs_1234",
        "amount": 1234.56,
        "note": "we in it now!"
    },
    "pages": 1,
    "links": {"next_page": "https://next-page.com/foo", "prev_page": "https://next-page.com/actually-previous-page"}
}
';

$encodedPayload = base64_encode($somePayload);

# Create the payload for our API call to the Fly Machines API
$appName = 'my-adhoc-puter';
$requestPayload = json_decode(sprintf('{
    "region": "bos",
    "config": {
        "image": "registry.fly.io/%s:latest",
        "guest": {"cpus": 2, "memory_mb": 2048,"cpu_kind": "shared"},
        "auto_destroy": true,
        "processes": [
            {"cmd": ["php", "/var/www/html/artisan", "ad-hoc"]}
        ],
        "files": [
            {
                "guest_path": "/opt/payload.json",
                "raw_value": "%s"
            },
            {
                "guest_path": "/opt/instructions.yaml",
                "raw_value": "%s"
            }
        ]
    }
}
', $appName, $encodedPayload, $encodedYaml));

// todo 🦉: create config/fly.php
// and set token to env('FLY_TOKEN');
$flyAuthToken = config('fly.token');

// (in a real file, this import belongs at the top)
use Illuminate\Support\Facades\Http;

Http::asJson()
    ->acceptJson()
    ->withToken($flyAuthToken)
    ->post(
        "https://api.machines.dev/v1/apps/{$appName}/machines",
        $requestPayload
    );

I created an artisan command that does that work here.

In your case, you might want to trigger this in your own code whenever you want some work to be done.

After you run some tasks, you should see the Machine VM spin up and do its work! Two things to make this more fun:

  1. Liberally use Log::info() in your code so Fly’s logs can capture what’s going on (helpful for debugging)
  2. Set your Logger to use the stderr logger so Fly’s logging mechanism can get the log output

Assuming that’s set up, you can then run fly logs -a <app-name> to see the log output as the Machine VM boots up, runs your code, and then stops.
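For the second point, the simplest approach (assuming the default config/logging.php, which already ships with a stderr channel) is to set the channel through the VM’s environment:

LOG_CHANNEL=stderr

With that in place, every Log::info() call ends up on stderr, which is what Fly’s log output reads from.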

Laravel News Links

Use several databases within your Laravel project

https://capsules.codes/storage/canvas/images/himICQw4vdilQhOi08bKqtpSTwFVabBiYq6NyevB.jpg

TL;DR: How to use multiple databases within your Laravel project and manage records separated across them.

You can find a sample Laravel project on our GitHub repository.

In an effort to maintain clarity for each of my projects, I separate my databases based on the role they play. This blog, for instance, includes several databases: one specifically for the blog and another for analytics. This article explains how to go about it.

A new Laravel project already contains, in its .env file, information related to the database, including the default mysql connection. We’ll be working with two databases: one and two. Making one the default connection is optional.

.env

Before

DB_CONNECTION=mysql
DB_HOST=127.0.0.1
DB_PORT=3306
DB_DATABASE=<database-name>
DB_USERNAME=
DB_PASSWORD=

After

DB_CONNECTION=one

DB_ONE_HOST=127.0.0.1
DB_ONE_PORT=3306
DB_ONE_DATABASE=one
DB_ONE_USERNAME=
DB_ONE_PASSWORD=

DB_TWO_HOST=127.0.0.1
DB_TWO_PORT=3306
DB_TWO_DATABASE=two
DB_TWO_USERNAME=
DB_TWO_PASSWORD=

The default .env file information is reflected in the database.php configuration file.

config/database.php

'connections' => [

        'mysql' => [
            'driver' => 'mysql',
            'url' => env('DATABASE_URL'),
            'host' => env('DB_HOST', '127.0.0.1'),
            'port' => env('DB_PORT', '3306'),
            'database' => env('DB_DATABASE', 'forge'),
            'username' => env('DB_USERNAME', 'forge'),
            'password' => env('DB_PASSWORD', ''),
            'unix_socket' => env('DB_SOCKET', ''),
            'charset' => 'utf8mb4',
            'collation' => 'utf8mb4_unicode_ci',
            'prefix' => '',
            'prefix_indexes' => true,
            'strict' => true,
            'engine' => null,
            'options' => extension_loaded('pdo_mysql') ? array_filter([
                PDO::MYSQL_ATTR_SSL_CA => env('MYSQL_ATTR_SSL_CA'),
            ]) : [],
        ],
    ...
]

We’ll duplicate this connection information as many times as there are connections.

'connections' => [

        'one' => [
            'driver' => 'mysql',
            'url' => env('DATABASE_URL'),
            'host' => env('DB_ONE_HOST', '127.0.0.1'),
            'port' => env('DB_ONE_PORT', '3306'),
            'database' => env('DB_ONE_DATABASE', 'forge'),
            'username' => env('DB_ONE_USERNAME', 'forge'),
            'password' => env('DB_ONE_PASSWORD', ''),
            'unix_socket' => env('DB_ONE_SOCKET', ''),
            'charset' => 'utf8mb4',
            'collation' => 'utf8mb4_unicode_ci',
            'prefix' => '',
            'prefix_indexes' => true,
            'strict' => true,
            'engine' => null,
            'options' => extension_loaded('pdo_mysql') ? array_filter([
                PDO::MYSQL_ATTR_SSL_CA => env('MYSQL_ATTR_SSL_CA'),
            ]) : [],
        ],

        'two' => [
            'driver' => 'mysql',
            'url' => env('DATABASE_URL'),
            'host' => env('DB_TWO_HOST', '127.0.0.1'),
            'port' => env('DB_TWO_PORT', '3306'),
            'database' => env('DB_TWO_DATABASE', 'forge'),
            'username' => env('DB_TWO_USERNAME', 'forge'),
            'password' => env('DB_TWO_PASSWORD', ''),
            'unix_socket' => env('DB_TWO_SOCKET', ''),
            'charset' => 'utf8mb4',
            'collation' => 'utf8mb4_unicode_ci',
            'prefix' => '',
            'prefix_indexes' => true,
            'strict' => true,
            'engine' => null,
            'options' => extension_loaded('pdo_mysql') ? array_filter([
                PDO::MYSQL_ATTR_SSL_CA => env('MYSQL_ATTR_SSL_CA'),
            ]) : [],
        ],
]

Then, we need to instruct the migrations to run against the different databases we created:

  • one: 2023_08_31_000000_create_foos_table.php
  • two: 2023_08_31_000001_create_bars_table.php

The Schema facade’s static connection('<connection-name>') method allows for this; we call it in the up() and down() functions.

2023_08_31_000000_create_foos_table.php

<?php

use Illuminate\Database\Migrations\Migration;
use Illuminate\Database\Schema\Blueprint;
use Illuminate\Support\Facades\Schema;

return new class extends Migration
{
    public function up() : void
    {
        Schema::connection( 'one' )->create( 'foos', function( Blueprint $table )
        {
            $table->id();
            $table->timestamps();
        });
    }

    public function down() : void
    {
        Schema::connection( 'one' )->dropIfExists( 'foos' );
    }
};

2023_08_31_000001_create_bars_table.php

<?php

use Illuminate\Database\Migrations\Migration;
use Illuminate\Database\Schema\Blueprint;
use Illuminate\Support\Facades\Schema;

return new class extends Migration
{
    public function up() : void
    {
        Schema::connection( 'two' )->create( 'bars', function( Blueprint $table )
        {
            $table->id();
            $table->timestamps();
        });
    }

    public function down() : void
    {
        Schema::connection( 'two' )->dropIfExists( 'bars' );
    }
};

Next, the models related to the migrations need to specify their database connection via the $connection attribute.

App\Models\Foo.php

<?php

namespace App\Models;

use Illuminate\Database\Eloquent\Model;

class Foo extends Model
{
    protected $connection = 'one';
}

App\Models\Bar.php

<?php

namespace App\Models;

use Illuminate\Database\Eloquent\Model;

class Bar extends Model
{
    protected $connection = 'two';
}

We can now run the migrations with php artisan migrate. By default, this command uses the connection given by DB_CONNECTION. If it’s not defined in the .env file, the connection has to be specified in the command: php artisan migrate --database=one.

In order to test the functionality, we can quickly register a closure for the main route.

web.php

<?php

use Illuminate\Support\Facades\Route;
use App\Models\Foo;
use App\Models\Bar;

Route::get( '/', function()
{
    $foo = Foo::create();
    $bar = Bar::create();

    dd( $foo, $bar );
});

The values are then created in the respective databases and visible in the browser.

In case a database refresh is needed using the command php artisan migrate:fresh, it’s worth noting that only the default database, i.e. the one specified by DB_CONNECTION, will be refreshed. Unfortunately, Laravel does not yet support the refreshing of multiple databases at the same time.

To refresh a database that is not the default one, it is necessary to use the command php artisan db:wipe --database=<database-name>. This command can be repeated for each additional database. Once all databases have been properly wiped with db:wipe, you can then proceed without errors with php artisan migrate:fresh.

You can also develop your own command that would automate the various tasks needed to clean your database.
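As a sketch, such a command could be as simple as looping over the connections defined above (one and two are hard-coded here purely for illustration):

<?php

namespace App\Console\Commands;

use Illuminate\Console\Command;

// Illustrative only: wipes each configured connection, then re-runs all migrations.
class RefreshAllDatabases extends Command
{
    protected $signature = 'db:refresh-all';

    protected $description = 'Wipe every connection and re-run all migrations';

    public function handle(): int
    {
        foreach (['one', 'two'] as $connection) {
            $this->call('db:wipe', ['--database' => $connection]);
        }

        return $this->call('migrate');
    }
}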

Glad this helped.

Laravel News Links

Generate Laravel Factories using ChatGPT

https://res.cloudinary.com/benjamin-crozat/image/upload/v1693318154/chatgpt-code-generation_ily1el.png

How to generate Laravel Factories using ChatGPT

Generating quality code using a Large Language Model such as GPT requires a basic understanding of the technology. And you can quickly learn about it here: How do language-based AIs, such as GPT, work?

That being said, you could also follow this tutorial, copy and paste my prompts, and be done with it!

Before I forget, I recommend using GPT-4 for better results, as it’s way smarter than GPT-3.5. Also, remember there’s a lot of randomness, and consistency across prompts cannot be ensured. That being said, the time you save will make up for it!

So, what problem are we trying to solve here?

During my freelance career, I stumbled upon a lot of codebases that weren’t leveraging Laravel Factories at all. That’s a bummer, because they can help you:

  1. Write tests with randomized inputs for your code.
  2. Set up a good local environment filled with generated data.

In a big codebase, there may be dozens of models, and writing factories for each of them all by yourself could take days of hard work.

Unless we leverage the power of AI, right?

By asking ChatGPT to think step by step and detail its reasoning, we can ensure better quality answers. But first, the requirements:

  1. The model’s table schema.
  2. The model’s code.

Then, here’s the prompt:

The model's table schema: <the model's table schema>

The model's code: <the model's code>

Goal: Use the information above to generate a Laravel Factory.

Instructions:
* Don't include attributes that are automatically handled by Laravel.
* Faker no longer recommends calling properties. Instead, call methods. For instance, "$this->faker->paragraph" becomes "$this->faker->paragraph()".
* Include a method for each many-to-many relationship using factory callbacks.

Review each of my instructions and explain step by step how you will proceed in an existing Laravel installation without using Artisan. Then, show me the result.

OpenAI enabled ChatGPT users to share their conversations with GPT publicly in read-only mode, which is a great way to share my experiments with you.

See what a Laravel Factory generated by ChatGPT looks like.
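For reference, the kind of factory the prompt asks for tends to look something like this — a hypothetical Post model with a many-to-many tags relationship, written by hand as an illustration rather than ChatGPT’s actual output:

<?php

namespace Database\Factories;

use App\Models\Post;
use App\Models\Tag;
use Illuminate\Database\Eloquent\Factories\Factory;

// Hypothetical example of the expected result, not generated by ChatGPT.
class PostFactory extends Factory
{
    protected $model = Post::class;

    public function definition(): array
    {
        return [
            // Faker called as methods, per the prompt's instructions
            'title' => $this->faker->sentence(),
            'body' => $this->faker->paragraph(),
        ];
    }

    // Many-to-many relationship handled through a factory callback
    public function withTags(int $count = 3): static
    {
        return $this->afterCreating(function (Post $post) use ($count) {
            $post->tags()->attach(Tag::factory()->count($count)->create());
        });
    }
}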

Laravel News Links

Multiple OpenAi Functions PHP / Laravel

https://miro.medium.com/v2/resize:fit:1200/1*X4NgzhgmPtOpdDdPPBiDlw.png

This article will hopefully help you to understand how to build a system that can work with multiple OpenAi API function calls!

Laravel News Links
