Overlapping Cache TTLs

Hoist with my own petard last night. Without getting into too much detail, at ib we acquire contextual data from the Palm Beach Post, add some elections data, and serve it to the public.

Not wanting the contextual data acquisition process to be too expensive for the Elections Engine, I scheduled it to occur once an hour.

Not wanting the contextual data delivery process to be too expensive for the Delivery Network, I also cached the results of the process for an hour.

The acquisition and caching processes run asynchronously – each TTL is a fixed hour, but because the two clocks aren't aligned, the effective freshness of the resulting content is unpredictable.

Long story short, PBP was updating their context rapidly. Sometimes the processes lined up just right and their updates appeared in minutes; other times the processes lined up just wrong and they didn't see their contextual changes for nearly two hours.
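The effect is easy to reproduce. Here's a minimal sketch (names and offsets are mine, not our actual code) that models the two layers as fixed-period clocks: an update has to wait for the next acquisition run, and then the acquired result has to wait for the delivery cache's TTL to expire. The worst-case delay approaches the sum of the two periods.

```python
ACQ_PERIOD = 60   # minutes between acquisition runs (Elections Engine)
CACHE_TTL = 60    # minutes the delivery cache holds a result

def time_to_visible(update_time, acq_offset, cache_offset):
    """Minutes until an upstream update becomes publicly visible,
    given the phase offsets (in minutes) of the two async processes."""
    # Next acquisition run at or after the update lands upstream.
    acquired = update_time + (acq_offset - update_time) % ACQ_PERIOD
    # The delivery cache serves stale data until its TTL next expires.
    visible = acquired + (cache_offset - acquired) % CACHE_TTL
    return visible - update_time

# Perfectly aligned clocks: the update is visible immediately.
print(time_to_visible(0, acq_offset=0, cache_offset=0))    # 0 minutes

# Update just misses the acquisition run, which just misses the
# cache refresh: nearly two hours of staleness.
print(time_to_visible(1, acq_offset=0, cache_offset=59))   # 118 minutes
```

With whole-minute granularity the delay ranges from 0 to 118 minutes for the same pair of one-hour TTLs, which matches what we saw in practice: sometimes minutes, sometimes hours.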

There must be some anti-pattern that describes this async cache strategy: I should try to find it online, if only to warn others of the dangers of over-zealous layered caching.
