CASE STUDY – how Greenstories improved site resilience with the help of Tideways
Welcoming bots and crawlers to your site is a necessary part of operating any web application or e-commerce store. That relationship turns sour when they engage in excessive behavior and adversely affect performance for regular users.
Our customer Greenstories experienced a 10-fold slowdown and a failure rate of 6% over a few hours when a crawler visited their store and disrupted the Shopware cart functionality. Overall, the response time of the application increased from 835 milliseconds to 8.57 seconds, and the failure rate rose from 0.6% to 6.1%. The slowdown affected all kinds of page types equally, like product and category details, as well as actions that users perform, like cart changes.
Greenstories used a combination of Tideways features to tackle the problem:
- Real-time monitoring & alerts: individually configured Tideways alerts provided Greenstories with the crucial starting point for investigation and optimization.
- Precise code-level profiling: using the differential flame graphs feature in Tideways, Greenstories was able to pinpoint the exact part of the code causing the slowdown (the INSERT cart SQL statement), saving valuable time during root cause analysis.
- Crawler detection insights: Tideways identified the unusual traffic pattern as bot-related, allowing Greenstories to take targeted action against crawler-induced load and prevent future disruptions.
Let’s have a look at it in detail:

The most impactful action performed during this period is listed at the top under the technical name CartLineItemsController::addLineItems, representing the code that runs when a user adds an item to their cart.

To find out what caused this performance slowdown, the next step is to investigate the profiling data of individual code statements that Tideways continuously collects from the application; the before-and-after comparison is a mere click away.
A comparison of the code-level performance changes is done using a visualization called a differential flame graph. It colors each code statement based on how much faster or slower it got. Look out for the dark red parts, which mark code that got over 300% slower.
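Conceptually, the coloring boils down to the relative change of each statement's measured time between the two periods. Here is a minimal sketch of that idea; the function names and the color buckets other than the 300% dark red mark are illustrative assumptions, not Tideways' actual implementation:

```python
def relative_change(before_ms: float, after_ms: float) -> float:
    """Relative slowdown of a code statement between two periods.
    0.0 = unchanged, 1.0 = 100% slower, negative = faster."""
    return (after_ms - before_ms) / before_ms

def flame_graph_color(before_ms: float, after_ms: float) -> str:
    """Map the change to a color bucket. Thresholds below the
    300% dark-red mark are illustrative, not Tideways' exact values."""
    change = relative_change(before_ms, after_ms)
    if change >= 3.0:   # over 300% slower
        return "dark red"
    if change > 0.0:    # slower
        return "red"
    if change < 0.0:    # faster
        return "blue"
    return "neutral"

# A statement that went from 10 ms to 45 ms is 350% slower:
print(flame_graph_color(10.0, 45.0))  # dark red
```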

The comparison highlights that a code operation on the “SQL” database layer titled INSERT cart is responsible for the majority of the slowdown.
With these insights, the problem could be narrowed down step by step:
- Tideways shows the Shopware function CartPersister::save being called several times per request; it stores the contents of the cart in the database with the SQL INSERT cart operation. Why did it happen multiple times and not just once?
- An analysis of the Shopware code base revealed that this is due to the cart rules failing based on the configured validation rules.
- Because no changes were made to cart rules during this timeframe, a conclusion was that this is attributable to bots that exhibit a very different behavior than regular users.
- More recently, this could be verified with Tideways’ crawler detection feature: traces with this performance behavior were flagged as requested by a “Crawler”.
This provided actionable information for Greenstories: it was necessary to better prevent bots from ransacking the site.
Common solutions for this are:
- using a web application firewall (WAF)
- using rate limiting for user-agents / IP addresses
- detecting bots in PHP code by writing a Shopware plugin in combination with bot-detection libraries installed via Composer
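To illustrate the rate-limiting approach, here is a minimal sliding-window limiter keyed by client IP (or user-agent). In a real Shopware setup this logic would live in a PHP plugin or at the web-server/WAF layer; the Python below and its limits are only an illustrative sketch:

```python
import time
from collections import defaultdict, deque
from typing import Optional

class SlidingWindowRateLimiter:
    """Allow at most `max_requests` per `window_seconds` for each key
    (e.g. a client IP address or a user-agent string)."""

    def __init__(self, max_requests: int, window_seconds: float):
        self.max_requests = max_requests
        self.window = window_seconds
        self.hits = defaultdict(deque)  # key -> timestamps of recent hits

    def allow(self, key: str, now: Optional[float] = None) -> bool:
        now = time.monotonic() if now is None else now
        q = self.hits[key]
        # Drop timestamps that have fallen out of the window
        while q and now - q[0] >= self.window:
            q.popleft()
        if len(q) >= self.max_requests:
            return False  # over the limit: reject, e.g. with HTTP 429
        q.append(now)
        return True

# A crawler hammering the cart endpoint is cut off after the limit:
limiter = SlidingWindowRateLimiter(max_requests=3, window_seconds=60.0)
results = [limiter.allow("203.0.113.5", now=t) for t in (0, 1, 2, 3)]
print(results)  # [True, True, True, False]
```

The same idea is available off the shelf: nginx's limit_req module, for example, implements request-rate limiting per client without any application code.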
But the root cause investigation also highlighted that, on the code level, Shopware could behave better when storing carts that contain errors. Therefore, we filed an issue for that purpose, which was quickly fixed by Shopware for an upcoming version. So this performance drop ultimately led to a solution that benefits all Shopware stores.
tl;dr
Here is a quick-read presentation of the case study showing how Tideways helped our customer Greenstories improve site performance: