A Green Beret Shows You How to Shoot a Pistol…in 8 Minutes

Learn how to shoot a pistol…fast. In this video, James Reeves trains with experienced Green Beret Jimmy Cannon alongside @daniellevalkyrie in Austria. This video offers a unique opportunity to learn fundamental pistol-handling skills, all presented in a concise, easy-to-follow format. Whether you’re a beginner seeking a strong foundation or a seasoned enthusiast looking […]

Read More …

The post A Green Beret Shows You How to Shoot a Pistol…in 8 Minutes appeared first on The Firearm Blog.

The Firearm Blog

Agent Orange damages the brain like Alzheimer’s

https://www.futurity.org/wp/wp-content/uploads/2024/02/agent-orange-brain-damage-alzheimers-disease-1600.jpg

An orange cloud in front of trees in the background.

Exposure to Agent Orange damages brain tissue in ways similar to Alzheimer’s disease, according to a new study.

Agent Orange, an herbicide used during the Vietnam War, is a known toxin with wide-ranging health effects.

Even though it has not been used for decades, there is increasing interest in its effects on the brain health of aging veterans.

The new study reveals the mechanisms by which Agent Orange affects the brain and how those processes can lead to neurodegenerative diseases.

“These chemicals don’t just affect veterans; they affect our entire population.”

The research shows that exposures to Agent Orange herbicidal chemicals damage frontal lobe brain tissue of laboratory rats with molecular and biochemical abnormalities that are similar to those found in early-stage Alzheimer’s disease.

The findings, published in the Journal of Alzheimer’s Disease, could have important implications for military veterans who were exposed to Agent Orange during the Vietnam War, says study author Suzanne M. De La Monte, a professor of pathology and laboratory medicine and neurosurgery at Brown University’s Warren Alpert Medical School.

“If we can show that prior exposure to Agent Orange leads to subsequent neurodegenerative disease, then that gives veterans a chance to get help,” De La Monte says.

But the study’s findings have much broader significance, she adds, because the toxins in Agent Orange are also present in lawn fertilizers.

“These chemicals don’t just affect veterans; they affect our entire population,” De La Monte says.

Agent Orange and Alzheimer’s

Agent Orange is a synthetic defoliating herbicide that was widely used between 1965 and 1970 during the Vietnam War. Members of the US military were exposed to the chemical when stationed close to enemy territory that had been sprayed by aircraft.

Government reports show that exposure to Agent Orange also caused birth defects and developmental disabilities in babies born to Vietnamese women residing in the affected areas. Over time, studies showed that exposure to Agent Orange was associated with an increased risk of some cancers as well as cardiovascular disease and diabetes.

Research also revealed associations between Agent Orange exposures and later development of nervous system degenerative diseases, and significantly higher rates and earlier onsets of dementia. However, in the absence of a proven causal link between Agent Orange and aging-associated diseases, there has been a need for studies that improve understanding of the process by which the herbicide affects the brain.

“Scientists realized that Agent Orange was a neurotoxin with potential long-term effects, but those weren’t shown in a clear way,” De La Monte says. “That’s what we were able to show with this study.”

The analysis was conducted by De La Monte and Ming Tong, a research associate in medicine at Brown; both are also affiliated with Rhode Island Hospital, an affiliate of the Warren Alpert Medical School. The research builds upon their recent studies of Agent Orange chemicals applied to immature human cells from the central nervous system, which showed that short-term exposure to Agent Orange has neurotoxic and early degenerative effects related to Alzheimer’s.

The researchers investigated the effects of the two main constituents of Agent Orange (2,4-dichlorophenoxyacetic acid and 2,4,5-trichlorophenoxyacetic acid) on markers of Alzheimer’s neurodegeneration using samples from the frontal lobes of laboratory rats. The mature, intact brain tissue samples included a full, complex array of cell types and tissue structures.

The scientists subjected the samples to cumulative exposures of Agent Orange, as well as to its separate chemical constituents, and observed the underlying mechanisms and molecular changes.

They found that treatment with Agent Orange and its constituents caused changes in the brain tissue corresponding to brain cell degeneration, and molecular and biochemical abnormalities indicative of cytotoxic injury, DNA damage, and other issues.

These chemicals are ‘everywhere’

The approach used by the researchers helped them better characterize the neuropathological, neurotoxic and neurodegenerative consequences of Agent Orange toxin exposures in young, otherwise healthy brains, as would have been the case for Vietnam War-era military personnel and many local residents in Vietnam.

“Looking for the early effects tells us that there is a problem that is going to cause trouble later on and also gives us a grip on the mechanism by which the agent is causing trouble,” De La Monte says. “So if you were going to intervene, you would know to focus on that early effect, monitor it, and try to reverse it.”

De La Monte hopes to be involved in additional research on human brain tissue to evaluate the long-term effects of Agent Orange exposures in relation to aging and progressive neurodegeneration in Vietnam War veterans.

The use of Agent Orange was prohibited by the US government in 1971. However, the chemicals remain in the environment for decades, De La Monte says. According to the study authors, the widespread, uncontrolled use of Agent Orange in herbicide and pesticide products is such that one in three Americans has biomarker evidence of prior exposure.

Despite growing recognition of the broad toxic and carcinogenic effects of 2,4-dichlorophenoxyacetic acid, the researchers note that concern has not achieved a level sufficient for federal agencies to ban its use.

The researchers conclude that the results of this study and another recent publication support the notion that Agent Orange as well as its independent constituents (2,4-dichlorophenoxyacetic acid and 2,4,5-trichlorophenoxyacetic acid) exert alarming adverse effects on the mature brain and central nervous system.

“That’s why it’s so important to look into the effects of these chemicals,” De La Monte says. “They are in the water; they are everywhere. We’ve all been exposed.”

The National Institute on Alcohol Abuse and Alcoholism at the National Institutes of Health supported the work.

Source: Brown University

The post Agent Orange damages the brain like Alzheimer’s appeared first on Futurity.

Futurity

Superman & Batman vs. Darth Vader

https://theawesomer.com/photos/2024/02/superman_batman_vader_t.jpg

Superman & Batman vs. Darth Vader

Link

The Dark Knight takes on the Dark Lord of the Sith in a quest to release his Justice League teammate from an Imperial prison cell. Batinthesun’s live-action fan film pulls out all the stops as Batman dons a lightsaber and other Wayne Enterprises gadgets on a quest to take down Darth Vader.

The Awesomer

11+ Laravel Tips: Optimize Database Queries (2024)

https://websolutionstuff.com/adminTheme/assets/img/11_laravel_tips_optimize_database_queries_2024.jpg

Hey developers! If you’re like me, constantly striving to make your Laravel applications faster and more efficient, you’re in for a treat. In this guide, I’m excited to share 11+ game-changing Laravel tips to supercharge your database queries as we step into 2024.

Database optimization doesn’t have to be a head-scratcher, and with these simple tips, we’ll explore ways to enhance your Laravel projects, ensuring they not only run smoothly but also deliver top-notch performance.

In this article, we’ll walk through 11+ Laravel tips for optimizing database queries in 2024, applicable to Laravel 8, 9, and 10.

Ready to dive into the world of optimized database queries? Let’s make our Laravel applications faster and more responsive together.

1. Minimizing Unnecessary Queries

Sometimes, we end up running database queries that aren’t really needed. Take a look at the example below.

<?php
 
class PostController extends Controller
{
    public function index()
    {
        $posts = Post::all();
        $private_posts = PrivatePost::all();
        return view('posts.index', ['posts' => $posts, 'private_posts' => $private_posts ]);
    }
}

The provided code fetches rows from two distinct tables (e.g., "posts" and "private_posts") and then sends them to a view. Take a peek at the corresponding view file presented below.

// posts/index.blade.php
 
@if( request()->user()->isAdmin() )
    <h2>Private Posts</h2>
    <ul>
        @foreach($private_posts as $post)
            <li>
                <h3>{{ $post->title }}</h3>
                <p>Published At: {{ $post->published_at }}</p>
            </li>
        @endforeach
    </ul>
@endif
 
<h2>Posts</h2>
<ul>
    @foreach($posts as $post)
        <li>
            <h3>{{ $post->title }}</h3>
            <p>Published At: {{ $post->published_at }}</p>
        </li>
    @endforeach
</ul>

As you can see above, $private_posts is visible only to users who are admins; all other users never see these posts, yet the query to fetch them runs for every request.

We can modify our logic as below so that the extra query only runs when it is actually needed.

$posts = Post::all();
$private_posts = collect();
if( request()->user()->isAdmin() ){
    $private_posts = PrivatePost::all();
}

 

2. Consolidate Similar Queries for Improved Efficiency

Sometimes, we find ourselves needing to create queries to fetch various types of rows from a single table.

$published_posts = Post::where('status','=','published')->get();
$featured_posts = Post::where('status','=','featured')->get();
$scheduled_posts = Post::where('status','=','scheduled')->get();

Instead of these three separate queries, we can fetch all the rows in a single query and split the results in memory:

$posts =  Post::whereIn('status',['published', 'featured', 'scheduled'])->get();
$published_posts = $posts->where('status','=','published');
$featured_posts = $posts->where('status','=','featured');
$scheduled_posts = $posts->where('status','=','scheduled');
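Note that after the consolidated query, the follow-up where() calls filter the in-memory collection rather than hitting the database again. An equivalent one-pass variant (a sketch, using Laravel's Collection::groupBy()) looks like this:

```php
// One query, then split the collection by status in a single pass.
$grouped = Post::whereIn('status', ['published', 'featured', 'scheduled'])
    ->get()
    ->groupBy('status');

// Fall back to an empty collection if a status has no rows.
$published_posts = $grouped->get('published', collect());
$featured_posts  = $grouped->get('featured', collect());
$scheduled_posts = $grouped->get('scheduled', collect());
```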

 

 

3. Optimizing Performance: Adding Index to Frequently Queried Columns

When you’re filtering queries using a condition on a text-based column, it’s a smart move to slap an index on that column. Why? Because adding an index makes your queries way speedier when sifting through rows.

Think of it like a well-organized filing system – it just makes finding what you need a whole lot faster!

$posts = Post::where('status','=','published')->get();

In the example above, we’re fetching records based on a condition added to the "status" column. To boost the query’s performance, consider enhancing it with the following database migration.

Schema::table('posts', function (Blueprint $table) {
   $table->index('status');
});
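To check that the index is actually being picked up, you can inspect the database's query plan; in Laravel 8.12+ the query builder exposes this via explain() (a debugging sketch, not production code):

```php
// Dumps the query plan (e.g. MySQL EXPLAIN output) for this query,
// where you can verify that the "status" index is being used.
Post::where('status', '=', 'published')->explain()->dd();
```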

 

4. Optimize Pagination: Switch to simplePaginate Over Paginate

When it comes to paginating results, our typical approach would be:

$posts = Post::paginate(10);

When using pagination in Laravel, the typical approach involves two queries: one to retrieve paginated results and another to count the total number of rows in the table. Counting rows can be slow and impact query performance.

But why does Laravel count the total number of rows?

It does so to generate pagination links. By knowing the total number of pages beforehand, along with the current page number, Laravel facilitates easy navigation. You can jump to any page with confidence.

On the flip side, using simplePaginate skips the total row count, making the query faster. However, you sacrifice the knowledge of the last page number and the ability to jump to specific pages.

For large database tables, favoring simplePaginate over paginate can significantly improve performance.

$posts = Post::paginate(20); // Generates pagination links for all the pages

$posts = Post::simplePaginate(20); // Generates only next and previous pagination links
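For "load more" or infinite-scroll listings on large tables, recent Laravel versions (8.41+) also offer cursor pagination, which replaces OFFSET with a WHERE condition on an ordered column, so deep pages stay as fast as the first one:

```php
// Cursor pagination: generates "where id > ?" style conditions
// instead of OFFSET, so paginating deep into the table stays cheap.
$posts = Post::orderBy('id')->cursorPaginate(20);
```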

 

5. Optimizing Database Queries: Avoiding Leading Wildcards with the LIKE Keyword

When aiming to retrieve results that match a particular pattern, our usual go-to approach is to use:

select * from table_name where column like '%keyword%'

The previous query scans the entire table, which can be inefficient. If we’re aware that the keyword appears at the start of the column value, a more efficient query can be formulated as follows:

select * from table_name where column like 'keyword%'
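If you genuinely need to match a keyword anywhere in the text, a full-text index is usually a better fit than a leading wildcard. A sketch, assuming MySQL and Laravel 9+ (the column name here is hypothetical):

```php
// Migration: add a full-text index on the searched column.
Schema::table('posts', function (Blueprint $table) {
    $table->fullText('title');
});

// Query it with whereFullText instead of LIKE '%keyword%'.
$posts = Post::whereFullText('title', 'keyword')->get();
```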

 

6. Optimizing WHERE Clauses: Minimizing the Use of SQL Functions

It’s advisable to steer clear of using SQL functions in the WHERE clause, as they can lead to a full table scan. Take a peek at the example below: when querying results based on a specific date, the typical approach involves:

$posts = Post::whereDate('created_at', '>=', now())->get();

This will result in a query similar to below.

select * from posts where date(created_at) >= 'timestamp-here'

The initial query causes a full table scan because wrapping the column in the date() function prevents the database from using any index on created_at; the function has to be evaluated for every row before the condition can be checked.

To improve this, we can restructure the query to eliminate the need for the date SQL function, as shown below:

$posts = Post::where('created_at', '>=', now() )->get();
select * from posts where created_at >= 'timestamp-here'
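As a related sketch, matching a whole calendar day can also be done without wrapping the column in date(), by comparing against the day's boundaries (Carbon helpers assumed):

```php
use Carbon\Carbon;

// Matches every row created today. Because created_at itself is
// compared (no date() wrapper), an index on the column stays usable.
$posts = Post::where('created_at', '>=', Carbon::today())
    ->where('created_at', '<', Carbon::tomorrow())
    ->get();
```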

 

7. Optimizing Table Structure: Minimizing the Addition of Excessive Columns

To enhance performance, it’s wise to keep the number of columns in a table to a minimum. In databases like MySQL, you can optimize by breaking down tables with numerous columns into multiple tables. These tables can then be linked using primary and foreign keys.

Including excessive columns in a table extends the length of each record, leading to slower table scans. This becomes evident when executing a "select *" query, as it fetches unnecessary columns, causing a slowdown in retrieval speed.

 

8. Separating Columns with Text Data Type into Their Own Table

When dealing with tables that store substantial data, especially in columns like TEXT, it’s wise to consider separating them into their own table or into a less frequently accessed table.

This practice proves beneficial because columns with extensive information can significantly inflate the size of individual records, impacting query times.

For instance, picture a table named "posts" with a "content" column storing hefty blog post content. Given that this detailed content is typically required only when someone is viewing that specific blog post, extracting this column from the main "posts" table can dramatically enhance query performance, especially when dealing with a multitude of posts.
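A sketch of what that split might look like (the table, column, and model names here are hypothetical):

```php
// Migration: move the heavy content into its own table,
// linked back to posts by a foreign key.
Schema::create('post_contents', function (Blueprint $table) {
    $table->id();
    $table->foreignId('post_id')->constrained()->cascadeOnDelete();
    $table->longText('body');
    $table->timestamps();
});

// On the Post model, load the body only when it is actually needed:
public function content(): HasOne
{
    return $this->hasOne(PostContent::class);
}
```

Listing queries against the posts table now scan much shorter records, and the body is fetched via $post->content only on the detail page.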

 

9. More Efficient Method for Retrieving the Latest Rows from a Table

When we aim to fetch the most recent rows from a table, our usual approach often involves the following:

$posts = Post::latest()->get();
// or $posts = Post::orderBy('created_at', 'desc')->get();

The above approach will produce the following SQL query.

select * from posts order by created_at desc

Instead, you can order by the primary key, which is already indexed:

$posts = Post::latest('id')->get();
// or $posts = Post::orderBy('id', 'desc')->get();
select * from posts order by id desc

 

10. Optimizing MySQL Inserts

So far, we’ve focused on making select queries faster for fetching data from a database. Usually, our attention revolves around optimizing read queries. Yet, there are instances where we need to speed up insert and update queries as well.

// Instead of inserting records one by one like this:
foreach ($data as $record) {
    DB::table('your_table')->insert($record);
}

// You can optimize it by using the insert method with an array of data like this:
DB::table('your_table')->insert($data);
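If $data is very large, a single insert statement can exceed packet or memory limits. A common refinement (the chunk size of 500 is an arbitrary assumption; tune it for your workload) is to batch the rows:

```php
// Insert in batches of 500 rows per statement: far fewer queries than
// one-by-one inserts, without building one enormous statement.
foreach (array_chunk($data, 500) as $chunk) {
    DB::table('your_table')->insert($chunk);
}
```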

 

11. Inspecting and Optimizing Queries

When it comes to optimizing queries in Laravel, there’s no one-size-fits-all solution. After all, who knows your application better than you do? Understanding its behavior, the number of queries it churns out, and which ones are necessary is key.

By inspecting these queries, you gain valuable insights and can work towards reducing their overall number.

To aid in this crucial task, several tools are available to help you scrutinize queries on every page.

However, a word of caution: refrain from running these tools in your production environment. Doing so might compromise your application’s performance and expose sensitive information to unauthorized users.

Here are a few tools to inspect and optimize your queries:

  1. Laravel Debugbar: features a handy "database" tab, revealing all executed queries as you navigate through your pages. Visit each page in your application to observe the queries in action.
  2. Clockwork: provides debug information similar to Laravel Debugbar. However, instead of injecting a toolbar into your website, it displays the details in the developer tools window or as a standalone UI accessible at yourappurl/clockwork.
  3. Laravel Telescope: serves as an excellent debugging companion during local Laravel development. Once installed, access the dashboard by visiting yourappurl/telescope. Navigate to the "queries" tab to view and analyze all the queries executed by your application.

Remember, these tools are best suited for your development environment to fine-tune your queries without risking your production’s performance and security.
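If you prefer not to install a package, a minimal alternative sketch is to log queries yourself with Laravel's DB::listen hook (placing it in AppServiceProvider::boot() is an assumption; adapt to your app):

```php
use Illuminate\Support\Facades\DB;
use Illuminate\Support\Facades\Log;

// Log every executed query while developing locally.
// Never enable this in production: it is noisy and can leak data.
if (app()->environment('local')) {
    DB::listen(function ($query) {
        Log::debug($query->sql, [
            'bindings' => $query->bindings,
            'time_ms'  => $query->time, // execution time in milliseconds
        ]);
    });
}
```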

Happy optimizing!

 


You might also like:

Laravel News Links

Deadpool 3’s Trailer Is Here to Save the Marvel Cinematic Universe

https://i.kinja-img.com/image/upload/c_fill,h_675,pg_1,q_80,w_1200/e88bbe770aef551ed635c3449e39cb66.jpg

The first trailer for Deadpool 3 is here.
Image: Marvel Studios

Marvel Studios is only releasing one movie this year—but, from the looks of it, it’s going to be unforgettable. That movie, of course, is Deadpool & Wolverine, which brings the R-rated, fourth-wall-breaking hero from Fox’s X-Men Universe into the Marvel Cinematic Universe. Ryan Reynolds stars and, this time, he’s bringing along his friend Hugh Jackman as Wolverine.


Directed by Shawn Levy, Deadpool 3 is one of the most highly anticipated Marvel films in years and now the first trailer is here.

Filming only wrapped a few weeks ago, so the fact that we’re getting a trailer at all for this is pretty incredible. So, what do you think?

Deadpool 3 opens in theaters July 26.


Want more io9 news? Check out when to expect the latest Marvel, Star Wars, and Star Trek releases, what’s next for the DC Universe on film and TV, and everything you need to know about the future of Doctor Who.

Gizmodo

USS Texas: “The Most Gangsta Battleship Of All Time”

http://img.youtube.com/vi/3oJSRAFkJIs/0.jpg

The Fat Electrician has a tribute video for USS Texas (which is still undergoing refurbishment).

  • “Today we’re talking about the most gangsta battleship of all time: The USS Texas, predating both World Wars, being built in 1914.”
  • “It’s commonly referred to as the last dreadnought. But it’s not technically a dreadnought, belonging to the New York-class of battleships, which were commonly referred to as super-dreadnoughts.” It was a class of two, with only the New York and the Texas. There was a pre-dreadnought USS Texas laid down in 1889 and scrapped in 1911.
  • “They had the largest guns ever put on a boat up to that time. That would be the Mark 1, capable of launching two 14 inch shells that weighed nearly 1,600 pounds apiece. The USS Texas had five of them, two in the front and three in the back. It was like a freedom sedan.”
  • They also had ballistic calculators and analogue computers, making them the most accurate naval guns in the world at the time. Plus a whole bunch of smaller guns.
  • “It was the first ship in history to incorporate anti-aircraft guns.”
  • “As well as having a 12″ thick hull, an entire freedom foot of Pittsburgh steel. The only thing millimeters is going to do to that is scratch the paint.”
  • It was the first ship to have a complement of Marines onboard. “They let the water grunts drive the biggest gun ever made.”
  • The USS Texas saw “almost no combat” in World War I. Via Wikipedia: “Texas’s service with the Grand Fleet consisted entirely of convoy missions and occasional forays to reinforce the British squadron on blockade duty in the North Sea whenever German heavy units threatened.”
  • “But its actions in World War II made it a naval legend.” Lots of newer, more powerful ships than the Texas, but the Texas was the only battleship to engage the enemy in all five theaters.
  • “D-Day, June 6, 1944. The Texas would take its position 12,000 yards off the coast of Normandy.” It fired 235 rounds at German fortifications in just under 54 minutes. “That is four hundred and eight thousand pounds of ammunition.”
  • “The Texan was shooting the enemy with about three spicy Volvos a minute.”
  • “I’m trying to tell you the Grim Yeeter over here bitchslapped the enemy’s coastline with an entire car dealership in the amount of time it takes you to watch a TV show.”
  • Continued bombarding until running out of ammo June 11, at which point it went back for resupply. By the time it was back, allied troops had driven the enemy so far inland its guns couldn’t reach. So it moved in to 3,000 yards, the closest it could get without beaching the ship.
  • “It’s at that point the Texas said ‘Hold my beer’ and flooded all the blister tanks on the starboard side, tilting the entire boat, changing the angle of the guns, allowing them to reach further inland.”
  • “They gangster leaned a 32 thousand ton warship so they could continue to engage the enemy. This might be the most grunterrific moment in world history.”
  • “It’s not technically a war crime. Geneva didn’t even necessarily know that shit was a fucking option.”
  • The Texas would go on to fight at Okinawa and Iwo Jima.
  • More info at https://battleshiptexas.org/.

    Lawrence Person’s BattleSwarm Blog

    True dat

    https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj6P4fZeQ-ErUMLngDE8R37Vevq7jqsTEzgAymMbDY8fLneS66tKW18bZZqNm0kx_mHXxkt1FbIfrOU2faZVVTJxe3st1mkU-2eOpMVQj8OQU1pyZGXa0E0XSXjTzzFc66iarqx7PSQb2GuBFmspDFEyFrS7S2LEatAYNAAxG8qMvz0CVMaOk6_lfE16Fw/w400-h356/Thug%20culture%20problem.png

     

    Found on social media:

    True dat.

    Peter

    Bayou Renaissance Man

    Laravel – Eager loading can be bad!

    Laravel – Eager loading can be bad!

    January 28, 2024

    Hello!





    Yes, you read it right. Eager loading can be bad, really bad. However, we often resort to it when dealing with an N+1 scenario, thinking that we’ve resolved the issue, when in fact, we might have made it worse. How? Let’s see.

    How bad it gets

    For this demo, we are building Laravel Forge. Like (almost) every Laravel application, we will have a One To Many relationship.

    We aim to log every activity for a server. A log can include the activity type, the user who initiated it, and other useful information for later analysis.

    <?php

    namespace App\Models;

    use Illuminate\Database\Eloquent\Model;
    use Illuminate\Database\Eloquent\Relations\HasMany;

    class Server extends Model
    {
        public function logs(): HasMany
        {
            return $this->hasMany(Log::class);
        }
    }

    Now, in the application, we want to list all the servers. So, we might do something like

    
    
    <table>
        <tr>
            <th>Name</th>
        </tr>
        @foreach ($servers as $server)
        <tr>
            <td>{{ $server->name }}</td>
        </tr>
        @endforeach
    </table>

    Moving forward, we have 10 servers, and each of them has 1000 logs.

    So far, so good. Now, we want to display when the last activity on a server occurred

    <table>
        <tr>
            <th>Name</th>
            <th>Last Activity</th>
        </tr>
        @foreach ($servers as $server)
        <tr>
            <td>{{ $server->name }}</td>
            <td>
                {{ $server->logs()->latest()->first()->created_at->diffForHumans() }}
            </td>
        </tr>
        @endforeach
    </table>

    Basic stuff: we access the logs() relation, order it to retrieve the latest record, take the created_at column, and format it for better readability using diffForHumans(). The latter yields something like "1 week ago".

    But this is bad, we’ve introduced an N+1 problem.

    If you don’t know what an N+1 is: we are running the following queries

    
    select * from `servers`
    
    
    select * from `logs` where `logs`.`server_id` = 1 and `logs`.`server_id` is not null order by `created_at` desc limit 1
    select * from `logs` where `logs`.`server_id` = 2 and `logs`.`server_id` is not null order by `created_at` desc limit 1
    
    select * from `logs` where `logs`.`server_id` = 10 and `logs`.`server_id` is not null order by `created_at` desc limit 1

    To resolve this issue, we typically reach for eager loading (I know you did).

    
    $servers = Server::query()
        ->with('logs')
        ->get();
    
    
    <table>
        <tr>
            <th>Name</th>
            <th>Last Activity</th>
        </tr>
        @foreach ($servers as $server)
        <tr>
            <td>{{ $server->name }}</td>
            <td>
                {{ $server->logs->sortByDesc('created_at')->first()->created_at->diffForHumans() }}
            </td>
        </tr>
        @endforeach
    </table>

    With this update, we manage to reduce it to only 2 queries

    
    select * from `servers`
    
    
    select * from `logs` where `logs`.`server_id` in (1, 2, 3, 4, 5, 6, 7, 8, 9, 10)

    And it looks like we addressed the problem, right?

    Wrong! We’re only considering the number of queries. Let’s examine the memory usage and the count of loaded models; these factors are equally important.

    • Before eager loading
      • 11 queries: 1 to retrieve all the servers, plus 1 for each of the 10 servers.
      • A total of 20 models loaded.
      • Memory usage: 2MB.
      • Execution time: 38.19 ms.




    [Screenshot: before eager loading]

    • After eager loading
      • 2 queries: 1 to get all servers and 1 to get all logs.
      • A total of 10,010 models loaded.
      • Memory usage: 13MB (6.5x increase).
      • Execution time: 66.5 ms (1.7x increase).
      • Slower computational time due to loading all the models.




    [Screenshot: after eager loading]

    The tool in the screenshot is Debugbar.

    Looks like we didn’t fix anything; in fact, we made it worse. And keep in mind, this is a very simplified example. In a real-world scenario, you can easily end up with hundreds or thousands of records, leading to the loading of millions of models. The title makes sense now?

    How do we truly solve this?

    In our case, eager loading is a NO NO. Instead, we can use sub-queries and leverage the database to perform tasks it is built and optimized for.

    $servers = Server::query()
        ->addSelect([
            'last_activity' => Log::select('created_at')
                ->whereColumn('server_id', 'servers.id')
                ->latest()
                ->take(1)
        ])
        ->get();

    This will result in a single query

    select `servers`.*, (
            select `created_at`
            from `logs`
            where
                `server_id` = `servers`.`id`
            order by `created_at` desc
            limit 1
        ) as `last_activity`
    from `servers`

    Since the column we need from the relationship is now computed in a subquery, we have the best of both worlds: only 10 models loaded and minimal memory usage.

    You might be thinking that with this approach comes a drawback: the last_activity column is now a regular string. So, if you want to use the diffForHumans() method, you’ll encounter the Call to a member function diffForHumans() on string error. But no worries, you haven’t lost the casting; it’s as simple as adding a single line.

    $servers = Server::query()
        ->addSelect([
            'last_activity' => Log::select('created_at')
                ->whereColumn('server_id', 'servers.id')
                ->latest()
                ->take(1)
        ])
        ->withCasts(['last_activity' => 'datetime']) 
        ->get();

    By chaining the withCasts() method, you can now treat the last_activity as if it were a date.

    How about the Laravel way?

    The reddit community never disappoints! They have pointed out another alternative solution, a Laravel-ish approach: One Of Many.

    Let’s define a new relationship to always retrieve the latest log

    
    public function latestLog(): HasOne
    {
        return $this->hasOne(Log::class)->latestOfMany();
    }
    

    Now we can use the relationship like this

    
    $servers = Server::query()
        ->with('latestLog')
        ->get();
    

    This will result in the following queries

    select * from `servers`
    
    select `logs`.*
    from
        `logs`
        inner join (
            select MAX(`logs`.`id`) as `id_aggregate`, `logs`.`server_id`
            from `logs`
            where
                `logs`.`server_id` in (1, 2, 3, 4, 5, 6, 7, 8, 9, 10)
            group by
                `logs`.`server_id`
        ) as `latestOfMany` 
        on `latestOfMany`.`id_aggregate` = `logs`.`id`
        and `latestOfMany`.`server_id` = `logs`.`server_id`

    And it can be used in the Blade like this

    
    @foreach ($servers as $server)
        {{ $server->latestLog }}
    @endforeach

    For a comparison between the two methods:

    • Using subqueries
      • 1 query.
      • A total of 10 models loaded.
      • Memory usage: 2MB.
      • Execution time: 21.55 ms.




    [Screenshot: subquery approach]

    • Using the latestOfMany()
      • 2 queries.
      • A total of 20 models loaded.
      • Memory usage: 2MB.
      • Execution time: 20.63 ms.




    [Screenshot: latestOfMany approach]

    Both methods are really good; which one to use depends on your case. If you absolutely need the child model hydrated and will use all of its fields, go with latestOfMany(). If you only need a few fields, the subquery will perform better, because it selects exactly what you need: regardless of how many records you have, memory usage stays almost the same. With latestOfMany(), memory usage depends heavily on how many columns your table has. In the real world, a table can easily have 50 columns, so hydrating the model will be expensive even if it is only one model per parent. Keep that in mind when choosing!
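One further refinement when using latestOfMany() is to constrain the eager load to only the columns the view needs (a sketch; the column names are assumptions):

```php
// Eager load only the columns needed to render the view.
// The foreign key (server_id) must be included so Laravel can
// match each log back to its server.
$servers = Server::query()
    ->with('latestLog:id,server_id,created_at')
    ->get();
```

This keeps the hydration cost low even on wide tables, narrowing the gap with the subquery approach.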

    Conclusion

    I have seen some developers choose, by design, to force eager loading for all models. You can’t just use it for everything; as much as it seems like you’ve solved the issue, you might have actually created a worse one. Not everything is a nail, and the hammer might not always work.


    Laravel News Links