The benefits of dynamic rendering for SEO

JavaScript frameworks have been gaining in popularity in recent years, thanks in large part to the flexibility they offer. “JavaScript frameworks enable rapid development, a better user experience, better performance and enhanced functionality that traditional, non-JavaScript frameworks lack,” said Nati Elimelech, Tech SEO Lead at Wix.

“So it’s no surprise that very large websites or complex user interfaces with complex logic and functionality generally tend to use JavaScript frameworks today,” he added.

At SMX Next, Elimelech provided an overview of how JavaScript works for client-side and dynamic rendering, and shared auditing insights gained from rendering JavaScript on over 200 million websites.

Client-side versus server-side rendering

Different rendering methods are suitable for different purposes. Elimelech advocated dynamic rendering as a means to satisfy both search engine bots and users, but first, you need to understand how client-side and server-side rendering work.

Client-side rendering

When a user clicks on a link, their browser sends requests to the server on which the site is hosted.

“When we talk about JavaScript frameworks, that server responds with something that’s a little bit different than what we’re used to,” Elimelech said.

“It responds with skeleton HTML, just basic HTML, but with lots of JavaScript. Basically, what it does is tell my browser to run JavaScript to get all the important HTML,” he said, adding that the user’s browser then outputs the rendered HTML (the final HTML used to build the page the way we actually see it). This process is known as client-side rendering.

A slide with a description of client-side rendering.
Image: Nati Elimelech.

“It’s a lot like building your own furniture because basically the server tells the browser, ‘Hey, these are all the pieces, these are the instructions, build the page. I trust you.’ And that means all the hard work is moved to the browser instead of the server,” Elimelech said.

Client-side rendering can be great for users, but there are cases where a client doesn’t run JavaScript, which means it won’t get the full content of your page. An example of this may be search engine crawlers; although Googlebot can now see more of your content than ever before, there are still limitations.
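To make the “skeleton HTML plus JavaScript” idea concrete, here is a minimal sketch of client-side rendering; the /api/product endpoint and the markup are invented for illustration. The server ships a nearly empty shell, and a script running in the browser fetches the data and builds the HTML visitors actually see.

```typescript
// Hypothetical client-side rendering sketch: the server sends only a shell
// (<div id="app"></div> plus this bundled script); the browser does the real work.

interface Product {
  name: string;
  description: string;
}

async function renderApp(): Promise<void> {
  // The "important HTML" does not exist until this JavaScript runs.
  const response = await fetch("/api/product/123"); // hypothetical API endpoint
  const product: Product = await response.json();

  const root = document.getElementById("app");
  if (root) {
    root.innerHTML = `
      <h1>${product.name}</h1>
      <p>${product.description}</p>`;
  }
}

// A client that never executes this script only ever sees the empty shell.
renderApp();
```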

Server-side rendering

For clients that don’t run JavaScript, server-side rendering can be used.

“Server-side rendering is when all that JavaScript is executed on the server side. All the resources are requested on the server side, and your browser and search engine bots don’t need to execute JavaScript to get the full HTML,” Elimelech explained. This means that server-side rendering can be faster and less resource-intensive for browsers.
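As a rough illustration (not Wix’s actual setup), server-side rendering can be sketched as a small Node server that assembles the finished HTML before responding, so the client receives the content without running any JavaScript. The product data here is hard-coded purely for the example.

```typescript
import { createServer } from "http";

// Hypothetical server-side rendering sketch: the HTML is assembled on the server,
// so any client -- browser or bot -- receives the fully rendered page.
const product = { name: "Example chair", description: "Already assembled for you." };

const server = createServer((_req, res) => {
  const html = `<!doctype html>
<html>
  <body>
    <h1>${product.name}</h1>
    <p>${product.description}</p>
  </body>
</html>`;

  res.writeHead(200, { "Content-Type": "text/html" });
  res.end(html); // no client-side JavaScript needed to see the content
});

server.listen(3000);
```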

A slide with a basic explanation of server-side rendering.
Image: Nati Elimelech.

“Server-side rendering is like providing your guests with an actual chair that they can sit on instead of having to put it together,” he said, continuing his earlier analogy. “And, when you do server-side rendering, you basically make your HTML visible to all kinds of bots, all kinds of clients . . . no matter what their JavaScript capabilities are, they can see the important, final rendered HTML,” he added.

Dynamic rendering

Dynamic rendering represents “the best of both worlds,” Elimelech said. Dynamic rendering means “switching between client-side rendered content and pre-rendered content for specific user agents,” according to Google.

Below is a simplified diagram explaining how dynamic rendering works for different user agents (users and bots).

A flowchart describing dynamic rendering.
Image: Nati Elimelech.

“So there’s a request to the URL, but this time we check: Do we know this user agent? Is this a known bot? Is it Google? Is it Bing? Is it Semrush? Is it something we know? If not, we assume it’s a user and then do client-side rendering,” Elimelech said.

In that case, the user’s browser executes the JavaScript to get the rendered HTML, but still gets the benefits of client-side rendering, which often include a perceived increase in speed.

On the other hand, if the client is a bot, server-side rendering is used to serve the fully rendered HTML. “So, you see everything that needs to be seen,” Elimelech said.

This represents “the best of both worlds” because site owners can still serve their content regardless of the client’s JavaScript capabilities. And, because there are two streams, site owners can optimize each to better serve users or bots without impacting the other.
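A minimal sketch of that decision logic, assuming a plain Node server and invented renderForBots/renderForUsers helpers, might look like this: inspect the User-Agent header, serve pre-rendered HTML to known bots, and fall back to the client-side shell for everyone else.

```typescript
import { createServer, IncomingMessage, ServerResponse } from "http";

// Hypothetical list of known crawler user agents (in practice this list must be maintained).
const KNOWN_BOTS = [/googlebot/i, /bingbot/i, /semrushbot/i];

function isKnownBot(userAgent: string): boolean {
  return KNOWN_BOTS.some((pattern) => pattern.test(userAgent));
}

// Placeholder renderers standing in for a real pre-renderer and a real client-side shell.
function renderForBots(url: string): string {
  return `<!doctype html><html><body><h1>Fully rendered HTML for ${url}</h1></body></html>`;
}

function renderForUsers(): string {
  return `<!doctype html><html><body><div id="app"></div><script src="/bundle.js"></script></body></html>`;
}

const server = createServer((req: IncomingMessage, res: ServerResponse) => {
  const userAgent = req.headers["user-agent"] ?? "";
  const html = isKnownBot(userAgent)
    ? renderForBots(req.url ?? "/") // bots: pre-rendered, JavaScript-free HTML
    : renderForUsers();             // users: client-side rendered shell

  res.writeHead(200, { "Content-Type": "text/html" });
  res.end(html);
});

server.listen(3000);
```

The two branches are the “two streams” Elimelech describes: each can be cached and optimized separately, but both have to be kept in sync.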

But dynamic rendering is not perfect

However, there are complications associated with dynamic rendering. “We have two streams to maintain, two sets of logic, caching, other complex systems; so it’s more complex when you have two systems instead of one,” Elimelech said, noting that site owners should also maintain a list of user agents to identify bots.

The pros and cons of dynamic rendering
Image: Nati Elimelech.

Some may worry that serving search engine bots something different than what you’re showing users could be considered cloaking.

“Dynamic rendering is actually a preferred and recommended solution by Google because what Google cares about is whether the important things are the same [between the two versions],” Elimelech said, adding that “the ‘important things’ are the things that matter to us as SEOs: the content, the headings, the meta tags, the internal links, the navigation links, the robots and title tags, the canonicals, the structured data markup, the images – anything to do with how a bot would react to the page . . . It’s important to keep them identical, and when they are kept identical, especially the content and meta tags, Google has no problem with that.”

Potential site parity issues when using different JavaScript rendering methods
Image: Nati Elimelech.

Since it’s necessary to maintain parity between what is served to bots and what is served to users, it’s also necessary to audit for issues that could break that parity.

To audit for possible problems, Elimelech recommends Screaming Frog or a similar tool that allows you to compare two crawls. “So what we like to do is crawl a website as Googlebot (or another search engine’s user agent) and crawl it as a user and make sure there’s no difference,” he said. Comparing the appropriate elements between the two crawls can help you identify potential problems.
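For a quick spot check outside a dedicated crawler, a rough script along these lines can fetch the same URL with a bot user agent and a regular browser user agent and compare a few of the “important things.” This is an illustrative assumption rather than Elimelech’s method, and the crude regex checks are no substitute for a full crawl comparison.

```typescript
// Rough parity spot check (Node 18+ for the built-in fetch). The regexes below are
// deliberately simple and for illustration only; a real audit tool does far more.

const BOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)";
const USER_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)";

function extract(html: string, pattern: RegExp): string {
  return pattern.exec(html)?.[1]?.trim() ?? "(missing)";
}

async function fetchAs(url: string, userAgent: string): Promise<string> {
  const response = await fetch(url, { headers: { "User-Agent": userAgent } });
  return response.text();
}

async function comparePages(url: string): Promise<void> {
  const [botHtml, userHtml] = await Promise.all([fetchAs(url, BOT_UA), fetchAs(url, USER_UA)]);

  const checks: Array<[string, RegExp]> = [
    ["title", /<title>(.*?)<\/title>/i],
    ["canonical", /<link[^>]+rel="canonical"[^>]+href="([^"]+)"/i],
    ["meta description", /<meta[^>]+name="description"[^>]+content="([^"]+)"/i],
  ];

  for (const [label, pattern] of checks) {
    const botValue = extract(botHtml, pattern);
    const userValue = extract(userHtml, pattern);
    const status =
      botValue === userValue ? "match" : `MISMATCH (bot: "${botValue}" vs. user: "${userValue}")`;
    console.log(`${label}: ${status}`);
  }
}

comparePages("https://example.com/");
```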

A slide with tools to audit the JavaScript versions of your site.
Image: Nati Elimelech.

Elimelech also mentioned several other methods for detecting these kinds of problems.

“Remember, JavaScript frameworks aren’t going anywhere,” he said. “You’ll most likely run into one of them soon, so you’d better be prepared to handle them.”

See the full SMX Next presentation here (free registration required).


About the Author

George Nguyen is an editor for Search Engine Land, covering organic and paid search. His background is in journalism and content marketing. Before entering the industry, he worked as a radio host, writer, podcast host, and public school teacher.