Profiling browser requests with Blackfire

This week’s task was optimising Magento for large carts (100+ different products), and a profiler is a good tool for finding the small pieces of code that can be optimised. Blackfire was my choice, mostly because of previous experience, but also because it is not a resource hog like Xdebug: for the same type of request, Xdebug took around 6 minutes while Blackfire took only 20 seconds.

Companion

Blackfire works great for profiling GET requests in the browser with the help of Companion, but Companion doesn’t let you profile POST requests. It also sends multiple requests to the same endpoint, and since Magento’s place order is not stateless, the first request and all subsequent ones would differ substantially.

CLI tool

My second attempt was the Blackfire CLI tool, which lets you profile anything you can reach with curl, including POST requests and custom headers. But you can only profile one request at a time, which means I would need to assemble a single HTTP call that does everything: log the customer in, add the products to the cart, collect totals, get shipping information, get payment information, save shipping, save payment and, finally, place the order. That way I would be profiling a lot more than I need, and because everything runs inside the same request it would not perform the way it does for the end customer: a lot of objects would already be cached in memory by the time the place order part executes. So I ruled that out too.
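
For reference, profiling a single POST with custom headers through the CLI looks roughly like this (the endpoint, cookie and payload below are hypothetical placeholders):

    # blackfire curl wraps a regular curl call and profiles the resulting request
    blackfire curl -X POST 'https://shop.example.test/checkout/onepage/saveOrder' \
        -H 'X-Requested-With: XMLHttpRequest' \
        --cookie 'frontend=abc123' \
        --data 'payment[method]=checkmo'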

Xdebug

Xdebug does exactly what I want, but it takes so much time that it is almost impractical. You can enable profiling for any HTTP request, so that wouldn’t be a problem: I just use my browser to navigate through all the needed steps (I can even do that with Xdebug disabled and enable it only for the last step) and then analyse the cachegrind files. But Xdebug uses too many resources; I was hitting the 5-minute timeout on FPM far too easily, and the results were contaminated by the slowness (everything was slower by a factor of ~5 compared to the latest Blackfire result).
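
For context, with Xdebug 2.x that per-request enabling is typically done through the profiler trigger; a minimal sketch (the host name is a placeholder):

    # php.ini (Xdebug 2.x): only write a cachegrind file when the trigger is present
    #   xdebug.profiler_enable_trigger = 1
    #   xdebug.profiler_output_dir = /tmp
    # then add XDEBUG_PROFILE to the one request you care about, e.g.
    curl 'https://shop.example.test/checkout/cart?XDEBUG_PROFILE=1'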

One nice tip here: you can use blackfire upload to send the Xdebug output files to Blackfire and see the analysis there. I am pretty amazed by that feature. Way to go, SensioLabs!
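
Uploading a cachegrind file is a single command (the file name is just an example of what Xdebug typically produces):

    # send an existing Xdebug profile to Blackfire and analyse it in their UI
    blackfire upload /tmp/cachegrind.out.12345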

Custom headers

With a bit of back and forth with SensioLabs support I was able to achieve what I wanted, even if it requires a bit more work.

Requirements

  1. A browser extension to modify request headers (I used ModHeader for Google Chrome);
  2. Blackfire CLI tool;
  3. Blackfire configured on the webserver;

Steps

  1. Do all the preparation needed. I added all the products to the cart, navigated to the checkout page and filled in all the information, leaving out only the ‘click on the place order button’ step;
  2. Run blackfire run env and copy the value of the BLACKFIRE_QUERY variable it outputs (see the snippet after this list);
  3. Using the browser extension, add the following headers to your request (example):

    X-Blackfire-User-Agent: Blackfire Companion - Chrome/58.0.3029.110 Extension/1.10.0
    X-Blackfire-Query: [the value copied on #2]
  4. Execute your request (click the place order button).
  5. Profit.
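
Step 2 in shell form: any command works after blackfire run, and env is just a convenient way to print the variables Blackfire injects:

    # get a fresh, single-use profiling token (everything after the first '=')
    blackfire run env | grep '^BLACKFIRE_QUERY='
    # paste that value into the X-Blackfire-Query header set by the browser extension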

With those steps, the first AJAX/POST request (that reaches PHP) carrying those headers will be profiled. It only works once, so it will not profile the success page (the page the profiled request redirects to). Every new request needs a new X-Blackfire-Query header, so with this approach it is not (yet) possible to profile the redirected page or any other subsequent calls from the same page.

If the request you want to profile is not the first one you are able to fire, then instead of using the browser extension, change your JavaScript code to include those headers only on the request you need (or keep the extension but change the JavaScript to remove the headers from the requests you do not want profiled).

What I have not tried

There are two other ways I could have profiled what I wanted with Blackfire, but I haven’t tested them:

  1. Use the PHP SDK and start/stop the profiling around the areas you need. It sounds like something that would work, but it means making changes to the code. I would only go that route if I had not found any alternative.

  2. Use the scenario I described for the CLI tool, but with the technique presented in Profiling HTTP Sub-Requests using Blackfire. To achieve that, I would have to assemble all the HTTP calls the same way Magento would have made them, and send the X-Blackfire-Query header with each of them. This would profile all the calls individually (thus avoiding the in-memory cache problem), but it also requires a bit of coding, or preparing all the HTTP requests manually to some degree (see the sketch below).
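
A rough sketch of what that could look like with plain curl, sharing a cookie jar so the Magento session survives between steps. The endpoints and form fields are hypothetical placeholders, and each profiled call needs its own fresh token:

    BASE='https://shop.example.test'
    JAR="$(mktemp)"

    # helper: grab a fresh single-use token for the X-Blackfire-Query header
    bf_token() { blackfire run env | grep '^BLACKFIRE_QUERY=' | cut -d'=' -f2-; }

    # preparation call, not profiled: it only builds up the session state
    curl -s -c "$JAR" -b "$JAR" -X POST "$BASE/customer/account/loginPost" \
        --data 'login[username]=user@example.test&login[password]=secret' > /dev/null

    # the call we want profiled: attach the header so Blackfire picks it up
    curl -s -c "$JAR" -b "$JAR" -X POST "$BASE/checkout/onepage/saveOrder" \
        -H "X-Blackfire-Query: $(bf_token)" > /dev/null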

Conclusion

Blackfire is a nice tool and I really appreciate the help their support provided here. I wish they would cover this edge case on their roadmap, but even after this discovery I am definitely not turning it down: I am very happy with the results. Please consider giving Blackfire a try if you need it.