Network Request Recording, JavaScript Error Logging and Deep Stack Traces!

Inspectlet now automatically records all XHR network requests on your site and collects every JavaScript error, complete with deep stack traces. Network logs include full request details, response data (when enabled), and request latency.

Track and monitor all JavaScript errors

Capture all JavaScript errors and console logs automatically. Identify new errors as soon as users encounter them.

JavaScript Error Logging

View all network requests at a glance

Network Request Logging

Full session playback for the ultimate context

See exactly how the error happened in the first place by watching a user’s entire visit with session replay.

If you have any questions or feedback about this feature, please send us a note at


New: Run A/B Tests on Your Website with Inspectlet!

We’re excited to announce that you can now run A/B tests in Inspectlet! Create experiments using the visual editor to make changes to your original webpage and test the impact of those changes on critical performance metrics like revenue, signup rate, etc.

A/B testing (also known as split testing) is a controlled experiment that compares the performance of two different versions of a webpage by showing half your visitors one version of your site, and the other half the second version. By comparing the performance (measured using Goals like signup rate or revenue) of each version of the webpage, we can empirically determine which version performs best and quantify the improvement rate.
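To make the comparison concrete, here is a minimal sketch (not Inspectlet's own reporting) of how the performance of two variations might be compared with a standard two-proportion z-test, using made-up numbers:

```python
import math

def compare_variations(conv_a, visits_a, conv_b, visits_b):
    """Compare two page variations with a two-proportion z-test.

    Returns the relative improvement of B over A and the z-score
    (|z| > 1.96 roughly corresponds to 95% confidence).
    """
    p_a = conv_a / visits_a
    p_b = conv_b / visits_b
    lift = (p_b - p_a) / p_a                      # relative improvement
    p_pool = (conv_a + conv_b) / (visits_a + visits_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / visits_a + 1 / visits_b))
    z = (p_b - p_a) / se
    return lift, z

# Example: 500 signups out of 10,000 visits vs. 600 out of 10,000
lift, z = compare_variations(500, 10_000, 600, 10_000)
print(f"lift: {lift:.1%}, z: {z:.2f}")
```

With these illustrative numbers, variation B shows a 20% relative lift at a z-score above 3, which would clear the conventional 95% confidence bar.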

You can run unlimited experiments with your users. To get started, simply head to the Dashboard and click A/B Experiments to create your first experiment. For a complete walkthrough of our new A/B testing offering, read the full documentation here.

As always, we would love to hear any thoughts and feedback at


Real-time Notifications via Email, Slack, Webhooks on Actions, Events and New Sessions

You can now receive real-time notifications via email, Slack, or a webhook to a custom URL whenever a visitor takes a particular action on your site (e.g. an event like a purchase), or whenever a new session is recorded in Inspectlet:

Notifications Settings

We hope this will be useful for staying on top of important events like:

  • Real-time notifications for purchases
  • Tracking billing failures and error occurrences
  • Being notified when a prospective customer visits a particular page, qualifying the lead for a sales follow-up

and much more!
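As a sketch of what a receiver at your custom webhook URL might do, here is a minimal handler. The payload field names (`type`, `amount`, `email`, `session_url`) are hypothetical, not Inspectlet's actual webhook schema:

```python
import json

def handle_notification(raw_body: str) -> str:
    """Route a hypothetical webhook payload to a human-readable message."""
    event = json.loads(raw_body)
    kind = event.get("type")
    if kind == "purchase":
        return f"New purchase: ${event.get('amount', 0)} from {event.get('email', 'unknown')}"
    if kind == "new_session":
        return f"New session recorded: {event.get('session_url', '')}"
    return f"Unhandled event type: {kind}"

payload = '{"type": "purchase", "amount": 49, "email": "jane@example.com"}'
print(handle_notification(payload))
```

In practice this function would sit behind an HTTP endpoint that forwards the message to your team's chat or ticketing system.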

If you have any questions about how to use notifications, please drop us a line at


New: Write notes for your sessions from the sidebar

We’re happy to share that you can now annotate your session recordings with notes! Now you can quickly jot down any thoughts/comments about your users as you watch them use your site:

Write notes for your sessions.

Any notes you write can later be searched in the session filters to retrieve interesting sessions. See someone having trouble on a page or getting confused? Now you can leave a label like “confusion, wrong page” and search for all sessions exhibiting “confusion” later.

Search for sessions by notes.

The new playback sidebar also lets you directly play other sessions from the same user, view and edit tags, and more. It’s available to all users immediately, so check it out!

With much love,
The Inspectlet Team


Increased Data Retention Across the Board – Introducing our New Data Retention Policy!

We’re excited to announce Inspectlet’s new data retention policy! As of today, Inspectlet stores all session recordings and heatmap data for between one month and one year, depending on your plan:

Data retention per plan

The new data retention policy is in effect immediately and gives you the ability to hold on to all your data for significantly longer, compare user behavior trends from different time periods, and observe the effects of long-term changes and improvements to your site.

We hope our new long-term retention policy is helpful! If you have any questions or feedback please feel free to reach out at

Click here for more information on data retention per plan.


Introducing the Inspectlet API

We’re incredibly excited to announce the release of the Inspectlet API!

The Inspectlet API lets you retrieve and search your Inspectlet data via a JSON-based REST interface.

All requests to the API are authenticated with HTTP Basic Authentication; you just need an API token to get started. To get your API token, log in to your Inspectlet account and look under “API Credentials” on the Your Account page.
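As a hedged illustration of making a Basic-Auth request from Python, the sketch below builds the `Authorization` header by hand. The endpoint URL is a placeholder, and placing the token in the username position with an empty password is an assumption here; consult the API documentation for the real endpoints and credential format:

```python
import base64
import urllib.request

API_TOKEN = "your-api-token"   # from "API Credentials" on the Your Account page
BASE_URL = "https://api.example.com/v1/sessions"  # placeholder, not a real endpoint

# HTTP Basic Authentication: base64-encode "username:password". Many token-based
# APIs accept the token as the username with an empty password (assumption here).
credentials = base64.b64encode(f"{API_TOKEN}:".encode()).decode()
request = urllib.request.Request(
    BASE_URL,
    headers={
        "Authorization": f"Basic {credentials}",
        "Accept": "application/json",
    },
)
# response = urllib.request.urlopen(request)  # would return the JSON body
```

Any HTTP client that supports Basic Authentication (curl, requests, etc.) works the same way; the header construction is the only moving part.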

To learn more and get started, check out the full API documentation.

We’d love to hear about how you’re using the API! If you have any questions or comments, please don’t hesitate to get in touch with us at


Case Study with Nathan: Conversions Skyrocket 250-300% using Inspectlet

We interviewed Nathan Lippi, a software consultant who used Inspectlet at Scribblar and other startups, about how Inspectlet made tangible differences to the user experience at each of the companies he worked with.

Issue 1 - Streamlining the process

I’d usually go on a burst of watching videos for a day or two and I’d spot enough things to give me enough to work on for a couple weeks … the results were immediate. There was a 250-300% increase in signups.

Click here to read the complete case study about Nathan’s optimization efforts and how Inspectlet can provide valuable insights.


3 Places You Are Letting Personal Opinion Hinder Lead Generation

Mistakes happen. You’ve made mistakes; I’ve made mistakes; your boss has made mistakes; your friends have made mistakes; your parents have made mistakes (don’t take that last one literally). Mistakes are one thing that just about everyone has in common.

For marketers, one of the most frequent places mistakes are made is in their lead generation strategy. Lead generation requires provoking interest from consumers by triggering some sort of emotional response. Since marketers are also human, it’s only natural that we fall back on our own emotions and biases to infer what those around us are like. And that right there is the biggest mistake a marketer can make.

To help identify where you may be letting your personal opinion hinder your lead generation performance, I’m going to cover three mistakes marketers make that are a detriment to their lead generation potential.

1.) Making Assumptions About Your Audience

Assumptions are a high risk, high reward decision, but one that I would not suggest making. Let’s say you are a Facebook advertising expert with an educational blog. For your first piece, you want to jump right in and cover Facebook Ad reporting. You realize you are skipping over a lesson on how to set up a Facebook Ad campaign in the first place, but you think your audience already knows that by now.

Best case scenario, you were right about your audience––most of them do in fact already know how to set up a Facebook ad campaign and are ready for your video on how to create reports based on their results. You didn’t even have to take the time to create an introductory tutorial on how to set up a Facebook ad, and your audience didn’t seem to mind. Nice!

But, worst case–– when you say “You guys are already familiar with how to set up a Facebook ad campaign” in your ad or video, you could be making a false assumption that leaves your audience thinking three things:

  • I must be stupid for not knowing this
  • This website is intended for more experienced marketers
  • I should leave

A lead should never be lost that way. You can always avoid making assumptions––whether that means additional research or content creation, it’s extra work that marketers sometimes try to skimp on.

Let’s real talk for a minute. If your business wants to stop making assumptions, keep this in mind:

  • Your audience may not like you
  • Your audience may not like the same things as you
  • Your audience might not really care about your brand
  • Your audience might not always understand you
  • Your audience might not understand industry jargon

Now, to clarify––assumptions are not always bad. Most great ideas start with an assumption. What’s going to make-or-break your lead generation strategy’s success though is your ability to test those assumptions.

Today, we have more advanced marketing analytics software available than ever before for testing assumptions. Inspectlet’s session recording feature is one such tool: it shows you exactly what visitors are doing on your site by tracking their mouse movements, clicks, scrolls, and keypresses. Its screen capture feature lets you record individual visitor sessions and play them back later to see how visitors navigate your site.

filter user sessions

Let’s say you are given the task of increasing the average amount of time visitors spend on your site. Continuing from the previous example, you make the assumption that having more videos on your site catered towards Facebook ad beginners will improve the average amount of time spent on your site. The session recording feature shows you that a high number of visitors are pausing your “Facebook Ads Report” video after a few seconds and then scrolling up and down your page for a bit before exiting.

This could be a sign that visitors are looking for a more introductory video; therefore supporting (but not quite yet confirming) your assumption. Just like that, you’ve avoided confusing or scaring a visitor away, while identifying a potential solution for generating more educated and profitable leads that you can continue testing.

Another example of where this screen capture software is helpful is illustrated below. The user is on a mobile phone and gets to a page with a call-to-action on it, but is unable to scroll down. Most people’s first move, after seeing that this page was not converting well, would be to split-test different CTAs and copy, when all along it was just an easily fixable technical error.

mobile session playback

2.) Underestimating Landing Page Elements

While it’s important to stay on top of industry best practices, these should not be the only thing dictating the elements your business split tests. Basing which elements to test on recommended best practices or your own personal opinion will lead you to underestimate the impact of [what you consider to be] minor elements of your landing page and ultimately limit your lead generation potential.

Let’s take a look at an example. The split-test below shows two variations of a landing page opt-in form. The only difference is that Version A includes a security badge and Version B does not. Can you guess which one yielded a higher conversion rate?

testing two versions of the same form

You would assume that having a security badge, which is intended to reassure customers of your business’s legitimacy, couldn’t possibly hurt your conversion rate, right? Wrong. Version B actually outperformed Version A by 12.6%. To reiterate what was said in the previous section, this is just one example of why you need to be testing your assumptions.

The problem with the security badge in Version A is that images like that are commonly associated with payments. Visitors filling out your opt-in form are at the very early stages of their customer journey and it is likely that this is the first time they are interacting with your brand. Putting the image there causes potential leads to subconsciously associate it with a request for their money, and they are scared away. This might seem silly to the more technologically inclined, but––here’s that keyword again––don’t assume your audience thinks the way you do.

Without running split-tests, it’s unlikely that anyone would have identified the security badge as the element hurting conversion rates. That is not a commonly tested landing page element, and its impact would have otherwise remained underestimated. It’s for this reason that your business should be split-testing every element of your page, whether you think it will impact performance or not, and paying close attention to the results. I promise you will be surprised by your findings.

3.) Neglecting To Learn From Mistakes

Not every lead generation tactic your business tries out is going to work the way you had hoped. The sooner you accept that, the sooner you can learn from your mistakes and avoid doing the same thing in the future.

It’s for this reason that it has become common practice for businesses to run A/B tests, as opposed to scrapping an entire page and starting from scratch because something about it wasn’t working. A/B tests allow marketers to incrementally test each element on their landing page to see which could be negatively impacting performance. Although these tests provide data that show what is and is not working, they don’t tell you the bigger picture of why a particular element is impacting performance the way it is.

two different button colors

As a basic example, let’s say you run an A/B test and find that your red call-to-action button is outperforming your green one. Aside from ensuring you change the standard CTA on that page to be red, it’s important that you examine what it is about a red button that makes it perform better. Some research on the psychology of color would reveal that red is commonly associated with urgency and triggers customers to act faster on an offer. By knowing this, you can then start implementing the color red into other facets of your landing page and further enhance lead generation rates.

Just as important as understanding why red is positively impacting your page’s performance is understanding why green is not. Rather than adopting a one-and-done mindset when testing different lead generation tactics, your business should be taking a more holistic and objective look at the successes and failures of your lead generation strategy to maximize the customer insights you are able to take away from each element.

(Guest Post by Ryan Lynch of Yazamo)

Interested in learning more about effective methods for turning visitors into leads? Check out Ryan’s free eBook, “Turning Visitors Into Leads”


Guide: How to Install Inspectlet with WordPress

WordPress is a fantastic blogging platform! It is easy to set up, has an extensive ecosystem, and is just generally a solid piece of software. We get questions all the time about best practices for setting up Inspectlet with WordPress, so we figured we would write a guide for you to use. :)

Step 1: Log in to your WordPress Dashboard and click on Plugins

Step 2: Search Plugins for “Inspectlet”

It should turn up the plugin you see below.

Step 3: Install and activate the plugin (see below)! You will need an Inspectlet account to get started. You can create a free account here

Step 4: Relax. You’re done!


Brief Guide to Iterating Effectively with A/B Testing

A/B testing lets you quickly try different versions of a page to find which variation resonates best with your visitors. This guide will detail how to get started with A/B testing on your site.

A note on statistical significance

In order to get accurate and reliable results from A/B testing, you’ll need a certain number of visits and successful conversions per variation before a true winner can be declared. There’s plenty of literature out there on the topic; here’s a calculator to give you an idea. If your site is small, be prepared to wait for some time to get accurate results.
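For a ballpark feel of the numbers involved, here is a rough per-variation sample size estimate using Lehr's approximation (about 80% power at 5% two-sided significance). Treat it as a sanity check, not a substitute for a proper calculator:

```python
import math

def sample_size_per_variation(baseline_rate, min_detectable_lift):
    """Rough sample size per variation via Lehr's approximation:
    n ~= 16 * p * (1 - p) / delta^2, where delta is the absolute
    difference in conversion rate you want to be able to detect."""
    delta = baseline_rate * min_detectable_lift
    return math.ceil(16 * baseline_rate * (1 - baseline_rate) / delta ** 2)

# Detecting a 20% relative lift on a 5% baseline conversion rate:
print(sample_size_per_variation(0.05, 0.20))
```

With a 5% baseline and a 20% relative lift to detect, this works out to roughly 7,600 visitors per variation, which illustrates why small sites need patience.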

Typically when starting out it’s a good idea to run the Null Hypothesis Experiment. This is a quick and easy way to make sure that there are no unaccounted variables in your testing platform that could spoil your data. Create an experiment with only 1 variation: nothing. The purpose of this experiment is to have a variation that’s the exact same page as the control page, so that the measured result is statistically indistinguishable from 0% improvement. If an experiment with no change reports a significant difference, there is something wrong with your testing platform!
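A quick simulation illustrates what to expect from such an A/A test: both arms serve the identical page, so random noise alone produces only a small measured difference, hovering near 0%. This is an illustrative sketch, not part of any testing platform:

```python
import random

random.seed(42)  # fixed seed so the sketch is reproducible

def simulate_aa_test(true_rate=0.05, visitors_per_arm=100_000):
    """Simulate an A/A test: both arms have the same true conversion
    rate, so the observed difference should be close to zero."""
    conv_a = sum(random.random() < true_rate for _ in range(visitors_per_arm))
    conv_b = sum(random.random() < true_rate for _ in range(visitors_per_arm))
    return conv_a / visitors_per_arm, conv_b / visitors_per_arm

rate_a, rate_b = simulate_aa_test()
print(f"A: {rate_a:.4f}  B: {rate_b:.4f}  diff: {rate_b - rate_a:+.4f}")
```

The observed difference is nonzero but tiny; a platform that reports a consistently significant gap on identical pages has a bug or a biased traffic split.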

Figure out what’s worth testing

Come up with some questions you want to answer with the help of solid data. For example:

  • Would a value-proposition based headline lead to more conversions compared to a concise explanation of my product?
  • Would including my prices up front help anchor my prospective clients’ expectations or scare them away before I have a chance to make my case?
  • Should I move this big paragraph about returning merchandise on the product page to the bottom, giving more attention to our free shipping policy?

Inspectlet’s user session recording tool can be handy here, allowing you to record and watch your real visitors. Watching a few people use your site can help you find areas causing confusion and allow you to witness unexpected behavior that you want to correct.

Give it a go!

It’s time to run our experiments! We like to use Optimizely at Inspectlet, but you can use any tool you fancy. When you’re creating the experiments, keep in mind that more variations per experiment means you’ll have to wait longer before reaching statistical significance. Set up some goals that align with your business metrics to track how well each variation performs.

Iterate based on results

split testing results

After some time you should have empirical data on each question you wanted to answer. You can also use Inspectlet to watch user sessions of visitors experiencing different variations of your site.

Watching user sessions not only helps you understand why a specific variation performed the way it did in the numbers; it also lets you witness any other changes to the user experience caused by the variation.

Rinse, repeat

Congratulations on running an experiment! Once you’ve found an improvement over the baseline, you can now push that change to all your visitors with confidence.
