There’s some really interesting conversation going on about IE8 these days.

Basically, it sounds like Microsoft took a thrashing when it released IE7, an imperfect but much more standards-compliant browser than IE6, and all those sites that had been built or hacked or conditionally commented to make them work properly in IE6 were suddenly ‘broken’ in the eyes of their owners.

Microsoft doesn’t want this to happen again when they release Internet Explorer 8 (even though the changes from IE7 to 8 will be less extensive than from 6 to 7). From what I’ve read (and I admit it’s not a lot), IE8 is going to be quite good. It follows the CSS 2.1 specification closely and sounds very promising!

But what can they really do to prevent ‘breakage’ of older sites built for less-compliant browsers?

Aaron Gustafson puts it this way:

“…We’re really only left with one option for guaranteeing a site we build today will look as good and work as well in five years as it does today: define a list of browser versions that the site was built and tested on, and then require that browser makers implement a way to use legacy rendering and scripting engines to display the site as it was intended—well into the future.”

Meaning that future (presumably better and even more standards-compliant) versions of browsers would be required to carry all the old rendering and scripting engines, so that sites keyed to a particular browser version still display correctly.

So if you have a site that you want to display in IE6’s rendering styles forever (I don’t know why you would, but just for the sake of argument), you could add a meta tag that says so. This would mean that the company you built the site for would never have to update it, no matter what changes happen in IE down the road. Their site would never appear to have ‘broken’, because it would still render in the non-standard browser as intended.
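If I understand the proposal correctly, the opt-in would look something like this – the `X-UA-Compatible` header name comes from the public discussion of the IE8 plan, but the exact values here are my assumption:

```html
<!-- Hypothetical version-targeting meta tag: asks IE to render this
     page with a specific engine, even in future browser versions. -->
<meta http-equiv="X-UA-Compatible" content="IE=8" />

<!-- Or freeze a legacy site to an older engine forever: -->
<meta http-equiv="X-UA-Compatible" content="IE=EmulateIE7" />
```

A page with no such tag would presumably get some default behavior – and which default Microsoft picks is a big part of the controversy.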

Personally, I have a problem with this. As a standards advocate and someone who lives and breathes web design, this bothers me. Technology improves and things change.

I know there are many many many many designers who don’t care about (or even know much about) standards. This will probably always be true, and that’s okay for them (it makes us look better). But aren’t you kind of short-changing your clients if you build a site and lock it in to the current version of a browser? What if some great new capability is developed that could just blow your client away – but they can’t have it because that old browser doesn’t support it?

This is just my opinion. But for me, it’s not good enough that a business site that I build worked last year. I want it to work in perpetuity. I like improving my skills and what I can offer my clients. I enjoy being able to validate my code (yes I do). It’s part of being a professional; again, my opinion.

And I think that for a business, adapting to change is important. If you have a website that’s three years old and breaks in IE7, I think you should fork over the funds and time to get it updated. It’s a cost of doing business.

Being able to freeze an older, non-compliant site in time forever is kind of like having your own Wayback Machine, and it seems to run counter to the very reason for being online in the first place… I totally agree with one of Eric Meyer’s statements:

“Thus, as a developer, there’s no need to look beyond the current state of browsers. I can just assume that browsers will always support what I’ve done even if it’s the worst kind of short-sighted, browser-specific, who-needs-standards-anyway type of development possible.”

I’m not entirely convinced that this is a terrible move, and my opinion might change. But I know that I would not be satisfied to have working websites out there functioning at a level three years out of date.

And, finally, it seems to me that if IE10 is to support all the bugs and sloppy rendering of all the versions of IE that went before it, won’t that be a ginormous amount of code? Is this even reasonable? Will it affect the function and speed of the browser? (I have no idea.)

And if IE starts doing it and finds it works well, will the other browser vendors start doing it too? Although I see less reason for it, since most other browsers have been closer to compliant for far longer…

Lots to consider on the web development front.

I read a post in a mailing list today that made me think. I know that the homepages for the big search engines are full of errors. They don’t validate – and it doesn’t matter. They’re not hurting for it. It makes not one iota of difference as far as search engines go whether code validates.

I try (I don’t always succeed, but I always try) to make sure my code is valid for both HTML and CSS. Why do I do this? Because I feel it’s the right thing to do.

I know that there are usually many ways to accomplish the same result when you’re building a web site. I get that, and I like that. But with my idealist viewpoint, I believe that some ways are cleaner, less intrusive, and easier to understand and change later than others.

Take for instance CSS hacks. When I first began using CSS heavily, I used hacks. But since I discovered conditional comments, I very rarely use a hack (the last one was about four months ago and specific to Opera). I like conditional comments because they seem to me to be a cleaner, more correct solution than toying with presentational elements in CSS that may later reveal incompatibilities as new browser versions are released (look at the list of CSS hacks that stopped working in IE7).
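For anyone who hasn’t used them: a conditional comment lives in the HTML and hands IE – and only IE – an extra stylesheet. The filename here is just an example:

```html
<!-- Only IE6 and earlier read this; every other browser sees an
     ordinary HTML comment and skips it entirely. -->
<!--[if lte IE 6]>
  <link rel="stylesheet" href="ie6-fixes.css" />
<![endif]-->
```

The IE-only fixes stay quarantined in their own file, which is exactly why this feels cleaner to me than scattering hacks through the main stylesheet.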

I met a designer last week whose work I love – he’s a very talented individual and a standards advocate as I am. We were talking about a common issue in CSS and I asked how he got around that – he agreed that conditional comments were the way to go but that he defined them within the stylesheet and not in the actual HTML page. He whipped out his laptop and showed me what he meant.

I looked at the code and thought ‘that’s a hack’ even as he said ‘this is a conditional comment.’ He supposed that a purist wouldn’t agree with his labeling.
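I can only guess at what he showed me, but the usual way to target IE from inside a stylesheet is a parser filter like the ‘star html’ hack – which is exactly why I’d call it a hack rather than a conditional comment. The selector and width below are made up for illustration:

```css
/* The 'star html' filter: standards-compliant browsers know nothing
   sits above the root html element, so they ignore this rule;
   IE6 and earlier match it anyway, thanks to a parser bug. */
* html #content {
  width: 500px; /* IE6-only override */
}
```

It works, but it depends on a browser bug rather than a documented feature – a bandaid, in my book.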

I’m a purist. I think hacks are messy, they’re more like bandaids than real solutions. For me, conditional comments are the cleaner fix.

I validate because I care very much that my code is clean. I define clean, in part, as being error-free, and validating helps me accomplish that, most of the time. I know it doesn’t matter to search engines, nor to the vast majority of my clients. But I consider my work as a web designer/developer to be a craft – and it matters to me.

I just read one fricking fantastic post by Andy Rutledge on web quality.

This is great – it defines the ongoing battle between designers who support web standards, and those who ignore/shun/berate them, as basically being a matter of semantics. That is, web standards are not about standardization or compliance, as some would maintain; web standards are really about the quality of web craft. ‘Standard’ is a high level of quality to strive for, not the confining destroyer of uniqueness in design that some would have us believe.

This is so simple and really gets to the heart of the matter, but I never thought of explaining it this way.

Andy also talks about the misuse of search engines. A search engine’s purpose is to organize all that information on the web and call it up – correctly – when that information is sought; it’s like an enormous virtual card catalog. Andy says that it’s “irresponsible to hinder this worthy task” by viewing search engines as only things to be manipulated or exploited for commercial or personal gain.

I highly recommend that all the web designers I know go read this article. Quality is what being a professional is all about!