A sandbox for the web
The original idea behind urlscan.io was to allow even inexperienced users to get a look at what a particular website is requesting in the background. Since its launch in December 2016, urlscan.io has become a widely used tool for security professionals and amateurs alike to investigate potentially malicious pages, such as phishing attempts or pages impersonating known brands.
Finally, urlscan.io queries external services to determine whether a page is malicious. It also tries to detect common malicious practices, such as cryptojacking.
Q: How can I request the content of a scan to be removed from your website?
A: Please use the orange Report button on the result page of the scan.
Q: Do you use my browser or Internet connection to analyze a website?
A: No! urlscan.io browses any website you request itself; your browser is not involved. The website you want to scan will never learn your IP address, and you will not be at risk when looking at the results.
Q: How does urlscan.io work?
A: We use the Google Chrome browser in Headless Mode to browse to the URLs submitted by users. We record the interaction of the page with the Internet and after the page has finished loading, we annotate the results with additional data sources.
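The flow described above can be pictured as two phases: record the page's HTTP transactions, then enrich each one with additional data. The following Python sketch illustrates only the general shape of such an annotation step — the function names and lookup tables are invented placeholders, not urlscan.io's actual code, and real deployments would use sources like a GeoIP database or an IP-to-ASN mapping instead of hard-coded dictionaries:

```python
# Illustrative sketch: enrich recorded HTTP transactions with
# extra data keyed on the server IP. GEO_DB and ASN_DB are
# invented placeholders standing in for real data sources.

GEO_DB = {"93.184.216.34": "US", "192.0.2.10": "DE"}            # placeholder GeoIP data
ASN_DB = {"93.184.216.34": "AS15133", "192.0.2.10": "AS64496"}  # placeholder ASN data

def annotate(transactions):
    """Attach country and ASN info to each recorded transaction."""
    for tx in transactions:
        ip = tx["ip"]
        tx["country"] = GEO_DB.get(ip, "unknown")
        tx["asn"] = ASN_DB.get(ip, "unknown")
    return transactions

# Example transactions as they might be recorded during a scan
recorded = [
    {"url": "https://example.com/", "ip": "93.184.216.34"},
    {"url": "https://tracker.test/p.gif", "ip": "192.0.2.10"},
]
annotated = annotate(recorded)
```

Keeping recording and annotation separate means new data sources can be added without touching the browser automation at all.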
Q: Where do the scans originate from?
A: Right now we only have one static location in Germany that is used to browse websites. This means that some content might occasionally be blocked or that you might get different results than browsing from your own country.
Q: Does urlscan.io show whether a website contains malware or phishing attempts?
A: Yes, we have some basic mechanisms for determining whether a website contains malicious content. One service we use is Google Safe Browsing. Ultimately, however, many malicious sites are not flagged, and there are occasional false positives. Please use our scan results as a tool to inform your verdict, not as the final judge of whether a page is indeed malicious.
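One way to combine several advisory reputation sources into a single hint — rather than a final verdict — is a simple weighted score. This is a hypothetical sketch; the source names and weights below are invented and do not reflect how urlscan.io actually aggregates its sources:

```python
# Illustrative only: combine advisory verdicts from several
# reputation sources into one score. Names and weights are
# made up; the result should inform a human verdict, not
# replace it, since each source can miss threats or misfire.

SOURCE_WEIGHTS = {"safe_browsing": 0.6, "blocklist_a": 0.25, "heuristics": 0.15}

def malicious_score(flags):
    """flags maps source name -> True if that source flagged the URL."""
    return sum(w for src, w in SOURCE_WEIGHTS.items() if flags.get(src))

# Two of three hypothetical sources flagged this URL
score = malicious_score({"safe_browsing": True, "heuristics": True})
# score lies between 0.0 and 1.0 and is only a hint
```

A scheme like this makes the "tool, not judge" framing concrete: a low score does not prove a page is safe, and a high score still warrants human review.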
Q: Help! The scan for my website says that a number of ads were blocked, but I don't have any ads on my page!
A: We use a number of different sources for determining ad-blocking, which vary in aggressiveness. We try to show as many potentially blocked resources as possible, but not every user running an ad-blocker will necessarily block all of them. Also keep in mind that ad-blockers don't just block ads but tracking code in general, Google Analytics being a common example.
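To see roughly how list-based ad-block detection can work, here is a toy matcher for a tiny subset of Easylist-style rules. Real filter lists support far more syntax (path patterns, `$domain` options, exception rules); this sketch only handles the common `||host^` domain-anchor form and is not urlscan.io's actual implementation:

```python
# Toy matcher for a small subset of Easylist-style rules.
# Only the "||host^" domain-anchor form is handled here;
# real lists are far richer and vary in aggressiveness.
from urllib.parse import urlparse

RULES = ["||doubleclick.net^", "||googlesyndication.com^"]  # sample rules

def hosts_from_rules(rules):
    """Extract the anchored hostnames from "||host^" rules."""
    return [r[2:-1] for r in rules if r.startswith("||") and r.endswith("^")]

def would_be_blocked(url, rules=RULES):
    """True if the URL's host matches an anchored rule (or a subdomain of it)."""
    host = urlparse(url).hostname or ""
    return any(host == h or host.endswith("." + h) for h in hosts_from_rules(rules))
```

Because the match is on the hostname, resources like analytics beacons get flagged alongside classic ad banners, which is why a page "without ads" can still show blocked requests.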
Q: Do you store results indefinitely?
A: Yes, but right now we make no guarantee that the results of a scan will stay up for any period of time. When we hit certain limits, we will have to start purging old scans.
Q: Do you support other browsers besides Google Chrome?
A: No, but you can set a custom User Agent during submission.
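Overriding the User-Agent just changes a request header; it does not change the underlying browser engine. A minimal illustration of setting a custom UA on an outgoing request, using Python's standard library (the UA string here is made up):

```python
# Setting a custom User-Agent header on a request, similar in
# spirit to overriding the UA at scan submission. The UA string
# "MyCustomScanner/1.0" is purely illustrative.
from urllib.request import Request

req = Request(
    "https://example.com/",
    headers={"User-Agent": "MyCustomScanner/1.0"},
)
```

Note that sites which vary their behavior by sniffing the UA string will respond to the spoofed value, but feature detection in JavaScript will still reveal the real engine.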
Q: Do you support IPv6?
A: Yes, and we're very happy about that because many similar services do not support it. If you want to try a cool site, submit http://test-ipv6.com.
Q: Do private submissions deliver different results than public ones?
A: No, private submissions will deliver the same results as public ones. The only difference is that private submissions will not show up in the list of recently scanned sites and in the search results.
Q: How do you handle mixed content on HTTPS websites?
A: We're not blocking mixed content by default. While this diverges from the way that modern browsers handle mixed content, it allows you to see all of the requests that could have been made.
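Mixed content simply means a page served over HTTPS loads some subresources over plain HTTP. A simplified classification sketch (browsers additionally distinguish passive mixed content such as images from active mixed content such as scripts, which this does not):

```python
# Flag plain-http subresource requests made by an https page
# ("mixed content"). Simplified: real browsers treat passive
# and active mixed content differently; we only split by scheme.
from urllib.parse import urlparse

def mixed_content(page_url, resource_urls):
    """Return the subresource URLs that would count as mixed content."""
    if urlparse(page_url).scheme != "https":
        return []  # mixed content only applies to secure pages
    return [u for u in resource_urls if urlparse(u).scheme == "http"]

flagged = mixed_content(
    "https://example.com/",
    ["https://example.com/app.js", "http://cdn.example.net/legacy.js"],
)
```

Not blocking these requests during a scan means the result page can show you every transaction the page attempted, including the insecure ones a browser would have suppressed.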
Q: Do you offer different browser locations/countries?
A: Not right now, but we might add this feature in the future. You will notice that our scanner currently connects from an ISP in Germany.
Q: Can I use your timing / latency data to monitor the performance of my page?
A: No, we include timing info just to get a rough sense of the relative latencies of resources. Please do not expect this service to deliver 100% accurate and consistent timing information.
Q: Can you include detailed performance analysis and improvement advice?
A: No, this is beyond the scope of this project. Best practices for web app performance are a complex and frequently changing topic. There are already tools and services that focus solely on improving website performance, from loading to rendering to interaction.
Q: Between different runs, websites often have a different number of HTTP transactions. Why is that?
A: The number of HTTP transactions depends on many factors:
- Time of day and actual content of the site
- Speed of the site (as we do have timeouts)
- Advertising embedded in the site
Q: Do you offer a commercial subscription or on-premise installation of urlscan.io?
A: No, we don't offer commercial plans right now. If you're interested in more powerful features or a dedicated instance of urlscan.io for your team, contact us at firstname.lastname@example.org.
urlscan.io is not the only service that can be used to browse and analyse a website. Below are some similar services, some of which provided invaluable inspiration for this very service!
Lists of similar & related services
- Investigate & report phishing pages by SwiftOnSecurity
- Blocklists of Suspected Malicious IPs and URLs by Lenny Zeltser
- urlquery.net - Scans sites and looks up domains/IPs on various blacklists. This service inspired us to build urlscan.io.
- URLVoid - Website Reputation Checker Tool
- keycdn speed test - Website speed test, employs similar techniques and inspired some features on this site
- WebPagetest - Exhaustive speed-testing service with different locations, browser and options
- pingdom - Website speed test
- Calibre Web performance monitoring - Professional service for monitoring web app performance
- Trackography - Find out who is tracking you when you are reading your favourite news online.
- Web Cookies Scanner - HTTP cookies, Flash, HTML5 localStorage, sessionStorage, CANVAS, supercookies, evercookies as well as SSL/TLS and HTTP security
- Hardenize - Helping you deploy the latest security standards
- Browserless - A headless browser in the cloud
- Lighthouse - analyzes web apps and web pages, collecting modern performance metrics and insights on developer best practices.
- lightcrawler - Crawl a website and run it through Google lighthouse.
- Puppeteer - Headless Chrome Node API, maintained by the Google Chrome Team
- betwixt - System level network proxy, providing inspection via Network panel
- Awesome chrome-devtools - Awesome tooling and resources in the Chrome DevTools ecosystem
- The website logos are generated by the free Logo API from Clearbit. For reasons of efficiency, these logos are delivered from our domain so we can proxy and cache them.
- The IP geo-location is courtesy of the MaxMind GeoIP Lite database.
- ASN information is thanks to Team Cymru's IP-to-ASN mapping service.
- We detect technologies on a website using the definitions from the Wappalyzer Project.
- URL badness is a result of querying the Google Safe Browsing API.
- We call a resource verified if it is part of a well-known set of resources served at cdnjs.com.
- We determine ad-blocking using the Easylist block lists.
- The country flags are part of the flag-icon-css library.
- The Bootstrap theme is called Flatly.
urlscan.io is not affiliated with any of the services we link to on our results pages. Linking to any site does not constitute an endorsement or guarantee of fitness of the data.