The Idea
This is the main corporate website for Keap. It's the engine behind the company's marketing campaigns, lead capture, education, and general product information.
What it Does
The site does a lot. Its main role is capturing leads through custom forms and interactive elements. It also hosts the company blog, which is fed from a headless Craft CMS instance.
Under the Hood
The Backend
The backend is built entirely on Laravel. Laravel's templating engine, Blade, lets us quickly create reusable components and keep turnaround times short on new projects. It also lets us build maintainable yet complex business logic on the backend to handle whatever needs come up. And since Laravel ships with a great caching facade, we can use Redis to improve page load times whenever we need to read data from a database or an API.
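To give a rough feel for how that caching layer gets used, here's a minimal sketch of the Cache facade pattern. The key name, endpoint, and TTL are all hypothetical, not actual project code:

```php
<?php

use Illuminate\Support\Facades\Cache;
use Illuminate\Support\Facades\Http;

// Hypothetical example: cache an expensive API read in Redis for an hour.
// With the cache driver set to redis, Cache::remember() reads through Redis
// and only hits the upstream API when the key is missing or expired.
$plans = Cache::remember('pricing.plans', now()->addHour(), function () {
    return Http::get('https://api.example.com/plans')->json();
});
```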
The Frontend
For most of the site, we built the frontend with Laravel's Blade templating engine. The blog is built on top of a headless Craft CMS instance that hydrates a Blade template with the blog data. For the interactive elements on the page we use a mixture of Alpine.js, Vue.js, and vanilla JavaScript. Any animation more complex than a simple CSS animation is handled with Motion (formerly Motion One). As for styles, we use Tailwind for any newly created pages, but a good chunk of the site still uses a legacy Foundation CSS library.
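As a small illustration of the Blade component approach, here's a hypothetical component; it isn't one of the site's actual components, and the class names assume Tailwind:

```blade
{{-- resources/views/components/cta-banner.blade.php (hypothetical) --}}
@props(['heading', 'url', 'label'])

<section class="p-8 rounded-lg shadow">
    <h2 class="text-2xl font-bold">{{ $heading }}</h2>
    <div class="mt-2">{{ $slot }}</div>
    <a href="{{ $url }}" class="mt-4 inline-block underline">{{ $label }}</a>
</section>

{{-- Usage anywhere in a page template: --}}
<x-cta-banner heading="Grow your business" url="/demo" label="Get a demo">
    Capture more leads with less manual work.
</x-cta-banner>
```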
Biggest Challenge
Every project comes with challenges, and this one was no different. One of the biggest was migrating from an old headless Drupal blog to a new headless Craft CMS blog. The site had been running on Drupal for years and had a lot of custom logic written around serving the blog and letting editors write content. I was tasked with finding and implementing a new CMS to migrate to. I chose Craft CMS for its modern coding standards, its extensibility, and its ease of use from an editor's standpoint.
The first task was handling the data migration from Drupal to Craft. For this I spun up a small Node server that acted as a middleman, transforming and validating the data before it was imported into Craft. This involved remapping URLs, categories, users, assets, and more, then serving it all up through easy-to-access REST endpoints that Craft could hit.
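The actual middleman was a Node service, but conceptually the transform step looked something like the sketch below, written in PHP here for consistency with the other examples. All field names and the mapping helpers are hypothetical:

```php
<?php

// Conceptual sketch of the middleman's transform step for a single post.
// mapCategory() and mapUser() stand in for lookup tables built during the
// migration; they are hypothetical helpers, not real project code.
function transformDrupalPost(array $drupalPost): array
{
    return [
        // Remap the legacy Drupal path alias onto the new URL scheme.
        'slug'       => trim(str_replace('/blog/', '', $drupalPost['path_alias']), '/'),
        'title'      => $drupalPost['title'],
        'body'       => $drupalPost['body']['value'] ?? '',
        'categoryId' => mapCategory($drupalPost['field_category']),
        'authorId'   => mapUser($drupalPost['uid']),
        'postDate'   => date('c', (int) $drupalPost['created']),
    ];
}
```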
With that done, the next task was connecting the GraphQL API to Keap's caching system. Keap's site stores all the blog posts in Redis for easy fetching and quick load times. After spending a while going through all the connection code written for Drupal, I decided the best approach was to start fresh. I wrote a simple GraphQL PHP client, set up some cache "profiles" containing the queries, and started working on a model pattern for Craft's entry types. Each entry type is its own model, extending a base model that offers basic caching and transforming functionality. Within each entry type model I could define custom caching and transform methods specific to that entry type and its data set.

Once that was set up, I needed a method to loop through the profiles and cache each one. This was a perfect use case for a Laravel job. By pushing the work onto the queue I could handle as many profiles as I wanted without affecting site performance, and I could handle errors for individual profiles rather than trying to catch errors for all the profiles at once.
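Here's a rough sketch of that shape. All names are hypothetical, the GraphQL client is assumed to exist, and the cache profiles are folded into the entry type models for brevity:

```php
<?php

use Illuminate\Support\Facades\Cache;

// Hypothetical base model: every Craft entry type extends this and inherits
// basic fetching, caching, and transforming behavior.
abstract class CraftModel
{
    abstract protected function cacheKey(): string;
    abstract protected function query(): string;  // the entry type's GraphQL query
    abstract protected function transform(array $entries): array;

    // A boolean flips between cached data and a live GraphQL fetch.
    public function fetch(bool $fromCache = true): array
    {
        if ($fromCache) {
            return Cache::get($this->cacheKey(), []);
        }

        $entries = app(GraphQLClient::class)->request($this->query());

        return $this->transform($entries);
    }

    public function recache(): void
    {
        Cache::forever($this->cacheKey(), $this->fetch(fromCache: false));
    }
}

// Hypothetical queued job: each profile is recached inside its own
// try/catch, so one failure doesn't abort the rest of the loop.
class RecacheCraftProfiles implements \Illuminate\Contracts\Queue\ShouldQueue
{
    use \Illuminate\Bus\Queueable;
    use \Illuminate\Foundation\Bus\Dispatchable;

    public function handle(): void
    {
        foreach ([BlogPost::class, Category::class] as $profile) {
            try {
                (new $profile)->recache();
            } catch (\Throwable $e) {
                report($e); // log the per-profile failure and keep going
            }
        }
    }
}
```

Dispatching is then just `RecacheCraftProfiles::dispatch()`, which a queue worker picks up off to the side of normal web traffic.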
With caching out of the way, I could move on to getting the data visible on the frontend. Luckily, all the Blade templates set up for the Drupal data could still be used; I just had to adjust some variable names to make sure each template got the right data. The hardest part was making sure Craft's "preview" mode worked correctly. Craft handles previews by attaching a token to the URL, and I can pass that token into my GraphQL request to get the current version of the post being edited or previewed. This of course has to bypass the caching system, since I need to display the edited data. Fortunately that was easy to do, since the models can fetch either live data or cached data with a simple boolean passed in.
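Building on the earlier sketch, the preview flow might look like this. The `previewToken` constructor argument and the route are hypothetical; the detail that matters is the boolean flipping the fetch from cached to live:

```php
<?php

use Illuminate\Http\Request;
use Illuminate\Support\Facades\Route;

// Hypothetical blog route: Craft's preview mode appends ?token=... to the URL.
Route::get('/blog/{slug}', function (Request $request, string $slug) {
    $token = $request->query('token'); // only present for editor previews

    // The token rides along on the GraphQL request so Craft returns the draft.
    $model = new BlogPost(previewToken: $token);

    // Previews bypass the cache; normal traffic reads from Redis.
    $post = $model->fetch(fromCache: $token === null);

    return view('blog.show', ['post' => $post]);
});
```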
Ultimately, after all the work was done, the site saw huge improvements, both in system performance and in editor experience. On the performance side, recaching times dropped from 30-90 seconds to 7-13 seconds, and restructuring the cache brought server response times down a bit as well. From an editor experience perspective, writers were able to produce articles faster overall, and we saw far fewer typos and broken URLs thanks to the improved UI and validation.
What’s up with it now?
Currently the Keap site is alive and well. It’s always changing and growing. Check it out for yourself: https://keap.com