A validator is a great tool that checks your site for invisible defects. There are many validators, but Sonar is different from all the others: it is the only testing tool that aims to validate websites comprehensively, is open source and community-driven, and has browser integration. Why do frontend developers need a comprehensive testing tool? Why does independence from any big software vendor matter? And finally, does Sonar have potential, and is it a useful tool today? I have collected some notes on its quality to find out.
What are validators good for?
I’ve been in web development for more than 17 years, and I like to watch how the World Wide Web changes over time. Some modern approaches become standardized while others fall into oblivion. I have encountered very good (HTML5 by Ian Hickson), very bad (RSS by Dave Winer), and very controversial (CSS Level 1 by the W3 Consortium) specifications. Sometimes the specification is wrong, other times the implementation is incorrect. Validators help maximize web compatibility, but they can also make you 100% compatible with one software company at the cost of breaking compatibility with dozens of other companies. There are basically two options: use as many validators as possible and iterate to minimize the number of errors, or use one universal, vendor-independent validator that knows the best possible techniques for you.
I currently use these validators:
- Bing Markup Validator
- Bing Mobile Friendliness Testing Tool
- Bing SEO Analyzer
- Facebook Debugger
- Google Lighthouse
- Google PageSpeed Insights
- Google Mobile-Friendly Test
- Google Structured Data Testing Tool
- Nu Html Checker
- Qualys SSL Server Test
- Sonar
- Twitter Card Validator
- W3C CSS Validation Service
- W3C Markup Validation Service
- Yandex Structured Data Validator
Let’s look at Sonar
Both Sonar and modern.IE are made by the IE/Edge dev team. Why was modern.IE discontinued, and why does Sonar not include validation of the Browser configuration schema?
I don’t think that a web page served with the Content-Type: application/xhtml+xml HTTP header must also carry the charset parameter, because the encoding of an XML document is UTF-8 by default.
I agree that a CSS file served with the Content-Type: text/css HTTP header should have the charset=utf-8 parameter, but why is this not already fixed in ASP.NET Core 2?
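The following is a minimal sketch of the mechanism being discussed, assuming a Node.js/Express static file setup rather than the ASP.NET Core 2 configuration mentioned above; the folder name and port are made up for illustration.

```typescript
import path from "node:path";
import express from "express";

const app = express();

// Serve static files and add an explicit charset to stylesheet responses,
// so the "text/css without charset=utf-8" report goes away.
app.use(express.static("wwwroot", {
  setHeaders: (res, filePath) => {
    if (path.extname(filePath) === ".css") {
      res.setHeader("Content-Type", "text/css; charset=utf-8");
    }
  },
}));

app.listen(8080);
```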
The recommendation about IE document modes is very funny. First, if the page is served as application/xhtml+xml, IE 9 and newer use an XML parser and force the highest available standards mode. Second, even if I send the content type as text/html, I really don’t need the X-UA-Compatible header or meta tag to avoid compatibility mode, because if the page contains the HTML5 doctype, IE 6 and newer automatically use the highest available standards mode (in the case of IE 6, only when the XML prolog is omitted). Sonar should check for this before it starts spreading alarming messages.
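As an illustration of the point above (my own example, not anything produced by Sonar), this is the kind of minimal markup that already gets the highest available standards mode in IE 6 and newer, with no X-UA-Compatible header or meta tag and no XML prolog; it is embedded in a TypeScript string only to keep one language for the code samples.

```typescript
// The HTML5 doctype on the first line is what matters here.
export const minimalStandardsModePage = `<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="utf-8">
  <title>Standards mode without X-UA-Compatible</title>
</head>
<body>
  <p>The doctype above is enough to keep IE out of compatibility mode.</p>
</body>
</html>`;
```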
It is interesting that during the Microsoft Edge Web Summit the recommended content type for ECMAScript files was application/javascript, yet now the recommended content type is text/javascript. However, RFC 4329 from 2006 recommends application/javascript. Sonar should at least note which browsers are still not compatible with this standard. By the way, ASP.NET Core 2 serves files with a .js extension as application/javascript.
I don’t think that the X-Content-Type-Options HTTP header must be specified when the server sends static files that are not user-generated.
I don’t think that the Strict-Transport-Security HTTP header must be specified when the server sends styles, scripts or images. The server redirects the user from HTTP to HTTP+TLS and sets the HSTS header during the first request. Styles, scripts and images are downloaded afterwards, so an HSTS header on those responses no longer has any effect. The situation is different when these resources are loaded from another domain. Sonar should distinguish these two cases and show an error only when the domain is different.
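A hypothetical sketch of the distinction this paragraph asks for (my illustration, not Sonar’s actual rule): report a missing Strict-Transport-Security header only when the resource host differs from the page host.

```typescript
// Report a missing HSTS header only for cross-host resources.
function shouldReportMissingHsts(
  pageUrl: string,
  resourceUrl: string,
  responseHeaders: Record<string, string | undefined>,
): boolean {
  const sameHost = new URL(pageUrl).host === new URL(resourceUrl).host;
  const hasHsts = responseHeaders["strict-transport-security"] !== undefined;
  return !sameHost && !hasHsts;
}

// A stylesheet pulled from another domain without HSTS would be reported:
shouldReportMissingHsts("https://example.com/", "https://cdn.example.net/site.css", {}); // true
// The same stylesheet on the page's own domain would not:
shouldReportMissingHsts("https://example.com/", "https://example.com/site.css", {}); // false
```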
Why should CSS or ECMAScript files have the charset=utf-8 parameter in the Content-Type header? Let’s suppose that the user agent ignores the BOM or the server has a UTF-8-encoded file without a BOM. That is an issue only when the file contains non-ASCII characters. Sonar should take this into account.
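A hypothetical check along the lines argued above (my sketch, not a Sonar rule): inspect the payload first and only treat the missing charset parameter as a problem when the file actually contains non-ASCII bytes.

```typescript
import { readFileSync } from "node:fs";

// True when the file contains at least one byte outside the 7-bit ASCII range.
function containsNonAscii(filePath: string): boolean {
  const bytes = readFileSync(filePath);
  return bytes.some((byte) => byte > 0x7f);
}

// Only a file for which this returns true would justify warning about a missing charset=utf-8.
console.log(containsNonAscii("wwwroot/site.css"));
```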
Sonar will not show a warning when the site uses content compression on an encrypted connection while chunked transfer encoding isn’t used. These three conditions together make the site vulnerable to the BREACH attack.
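A sketch of a warning that follows the three conditions named above, i.e. this article’s framing of BREACH rather than a complete detector; looking headers up in lower case is a convention of my own.

```typescript
// Encrypted connection + content compression + no chunked transfer encoding.
function matchesBreachConditions(
  url: string,
  responseHeaders: Record<string, string | undefined>,
): boolean {
  const encrypted = new URL(url).protocol === "https:";
  const compressed = responseHeaders["content-encoding"] !== undefined;
  const chunked = (responseHeaders["transfer-encoding"] ?? "").includes("chunked");
  return encrypted && compressed && !chunked;
}
```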
I agree that the Server and X-Powered-By headers are unnecessary and could pose a security risk, but why are Azure Web Apps configured to send them, and why doesn’t the default template for ASP.NET (Core) apps remove them in the web.config file?
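In the ASP.NET/Azure setup discussed above the fix belongs in web.config or the host configuration; purely as an illustration of how small the change is, this is the equivalent in an Express application (an assumption of mine, not part of this site’s stack).

```typescript
import express from "express";

const app = express();

// Express adds an X-Powered-By header by default; one call turns it off.
app.disable("x-powered-by");
```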
I don’t think that serving an image file which could be 44% smaller is an error. First, Sonar should report this as a warning, not an error. Second, why does System.Drawing.Bitmap encode images into the PNG format so inefficiently?
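One way to squeeze such an image outside of System.Drawing, shown here as a sketch with the Node.js sharp library (my choice of tool, not something used on this site):

```typescript
import sharp from "sharp";

// Re-encode a PNG with the strongest zlib compression and adaptive filtering.
async function recompressPng(input: string, output: string): Promise<void> {
  await sharp(input)
    .png({ compressionLevel: 9, adaptiveFiltering: true })
    .toFile(output);
}

recompressPng("logo.png", "logo.min.png").catch(console.error);
```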
Conclusion
Sonar is an ambitious project with potential similar to Lighthouse’s. On the other hand, Sonar seems to be at an early stage of development. It needs to become a first-class citizen in the F12 Dev Tools, because Google can assert its interpretation of the standards via Lighthouse in Chrome DevTools. Finally, it should be adopted by the Bing team, which has deep knowledge of the web and could improve Sonar’s quality until it surpasses Lighthouse.