kaszus 2 days ago

There is also a CSS media feature now to detect whether scripting is enabled or not. It's supported in the last 2 versions of the most popular browsers.

https://developer.mozilla.org/en-US/docs/Web/CSS/@media/scri...
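
A minimal sketch of how it can be used, reusing the article's d-js-required class (the feature's values are enabled, initial-only, and none):

    @media (scripting: none) {
      /* No scripting available: hide JS-only UI */
      .d-js-required { display: none; }
    }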

  • 0xdade 2 days ago

    Damn I didn't know this. I don't know how my search failed to turn this up, because I was literally googling how to accomplish it with media queries lol.

  • gjsman-1000 2 days ago

    By the time you are on the last 2 versions of popular browsers, you’re at the point a developer can safely assume you have JavaScript, and can force you to use it.

    • zamadatix 2 days ago

      The article is written around supporting "users who have JavaScript off", which is an option on all popular browsers. I.e. it's about better supporting users who don't want to use JavaScript, not about supporting e.g. Internet Explorer 1.0 (though that could be helped by this kind of strategy too).

      • gjsman-1000 2 days ago

        Which in practice is an irrelevantly low number of people.

        Nowadays, unless you are serving developing markets, supporting legacy technology, or your website is truly just some basic static pages, it’s a buzzword programmers use to flex on each other. That’s about it.

        I personally find it annoying because I grew up as a junior right at the inflection point, when all the tutorials were still saying it was the right thing to do. That’s a lot of work to do for something nobody - and I mean literally nobody - in the real world cares about anymore. I would sooner brag about how my blog still works on Safari bundled with Snow Leopard.

        • zamadatix 2 days ago

          Or it's something fun for a hacker to blog about polishing progressive enhancement on. You're free to choose your own takeaway of course. Part of the experience of being a junior is learning when each type of approach makes sense, rather than expecting anything one reads to apply directly to how their next project should be done, regardless of whether it's popular or unpopular. Not everything needs to be practical for most people to be worth writing about!

        • sneak 2 days ago

          I am not nobody. Neither is Ed Snowden. Neither are the people who trained him, who regularly use javascript to exploit browsers.

          Many of us that know how security works on the modern web browse untrusted sites without JS enabled.

          • userbinator 2 days ago

            This. The vast majority of browser exploits are in code related to the JS engine and its accompanying huge API surfaces, and even those rare ones which don't require it are often obfuscated/hidden using JS.

            Exploits aside, the amount of user-hostile irritating annoyances that just disappear without JS is also worth mentioning. Many years ago, I remember a site that was completely usable, yet continually begged me to "enable JavaScript for a better experience!", so I tried it and was immediately inundated with popups, moving flashing crap all over the page, and other things that I did NOT consider a "better experience". Never again.

        • perching_aix 2 days ago

          You mean the real world that can barely distinguish their browser from "Google"? Sounds like a very fair benchmark indeed.

          Did you know that HackerNews is 99% functional with JS turned off by the way? The only thing missing is the ability to collapse threads.

  • lolinder 2 days ago

    It's still only available on about 90% of global browser traffic. If you're going out of your way to support noscript smoothly, you probably care about supporting more than just 90% of web traffic.

    https://caniuse.com/mdn-css_at-rules_media_scripting

    • riedel 2 days ago

      I would guess even in absolute terms there will be more browsers with JavaScript disabled within the remaining 10%

      • ffsm8 2 days ago

        You think people that keep JS disabled will on average use older browsers?

        Idk, I honestly doubt that. In my experience the people using outdated browsers are on old mobile devices or in "special" enterprises that micromanage employee software installs.

        And technologically illiterate Apple users (because Safari doesn't auto-update like Chrome and Firefox).

theandrewbailey 2 days ago

> We’ll simply create a single class, d-js-required to indicate that JavaScript needs to be enabled for us to display this element.

You can run some JS to prove you're running JS. Can't you write in your site's CSS:

        .d-js-required { display: none; }
and (in JS) remove that class on each element it's on?
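
Something like this, as a sketch (class name taken from the article):

        // JS is running, so reveal everything that was hidden as JS-required
        document.querySelectorAll(".d-js-required")
          .forEach(el => el.classList.remove("d-js-required"));
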
  • 0xdade 2 days ago

    This is absolutely an option! It gets a little tricky to avoid shifting content around, since it's pretty typical to load styles in the head but load JavaScript either at the end of the DOM or with the defer attribute, so the JavaScript would likely run after the user has already seen the layout, and layout shifts could be clunky.

  • c0balt 2 days ago

    Wouldn't this delay the time to first content render? If you rely on display, then a layout shift would also be an issue for JS-capable clients.

    • charrondev 2 days ago

      Write that tiny bit of JS inline in the head and put the class/attribute on the html element.
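
      A rough sketch of that approach (the "js" class name here is illustrative):

          <head>
            <style>html:not(.js) .d-js-required { display: none; }</style>
            <script>document.documentElement.classList.add("js");</script>
          </head>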

grishka 2 days ago

I did it the other way around. I put this at the start of <body>:

    <script>document.body.classList.add("hasJS");</script>
And the corresponding CSS:

    .js{display: none;}
    .hasJS .js{display: block;}
  • MrJohz 2 days ago

    This has the advantage compared to OP's solution that it will also work if JS doesn't load correctly (as opposed to if it is blocked/deliberately not loaded). You can add this line to the end of the main JS file that you're including on your page, and it will ensure that the JS "switch" will only get triggered if that file could be downloaded and parsed.
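
    For instance, assuming a single bundle named main.js loaded with defer or at the end of <body>, its very last line would be:

        // Only runs if main.js downloaded, parsed, and executed without error
        document.body.classList.add("hasJS");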

jaredcwhite 2 days ago

Another solution: wrap the feature in a custom element, use CSS :not(:defined) to hide the element, then in JS register the custom element so it's defined and the CSS no longer applies.
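
A rough sketch (element name is illustrative):

    <style>js-widget:not(:defined) { display: none; }</style>

    <js-widget>
      <button>Only useful with JS</button>
    </js-widget>

    <script>
      // Registering the element makes it :defined, so the hiding rule stops matching
      customElements.define("js-widget", class extends HTMLElement {});
    </script>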

userbinator 2 days ago

> And if not, you can just copy the link down below to share this page!

Or you could just copy it from the address bar?

  • simonw 2 days ago

    I'm beginning to suspect that a substantial portion of users don't habitually do that, for two reasons:

    1. They don't understand URLs. That's not too surprising to be honest, URLs are pretty complex things.

    2. They've been burned enough times by crap SPAs that they don't instantly assume something they are looking at has a dependable URL they can copy and share.

    Thanks to 2 I've recently started rethinking a bunch of my own projects - I think I need to add explicit "share" buttons that copy URLs to people's clipboards for them.

    • 0xdade 2 days ago

      The share button I have uses the navigator.share API if it's available, or tries to fall back to navigator.clipboard. Unfortunately I think I didn't do adequate testing to make sure the clipboard feature works in whatever edge cases you might have the clipboard API but not the Share API, so I'm pretty sure _that_ is broken.
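
      For reference, the intended fallback shape is roughly this (the function name is illustrative):

          async function shareUrl(url) {
            if (navigator.share) {
              await navigator.share({ url });            // native share sheet
            } else if (navigator.clipboard) {
              await navigator.clipboard.writeText(url);  // fall back to copying
            }
          }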

      I really just put the URL at the end of the page with a `user-select: all;` to make it easy to copy _after_ you've read the content. It's also rendered server-side so even if I used utm or some other tracking things, the query params would automatically not be included in the server-rendered share link, just the permalink to the post.

      But to the original authors point, I also do find myself often just copying from the address bar and then manually deleting a bunch of the garbage at the end of URLs. Maybe that's why I thought it nice for a simple "copy this link" when JS is disabled.

bshacklett 2 days ago

Major props to the author for actually caring about this. JS seems like an assumption more than ever before. It’s great to see that people understand that some of us don’t want it.

Very loosely related: why do so many websites require JavaScript to handle links? I’m so tired of trying to figure out how to open a given link in a new tab. :-(

  • bombela 2 days ago

    and scrolling. and history.

    I am finding myself afraid of clicking any link, anchor, or button for fear of losing the current position, word wrapping, or the current form inputs.

    It's like I am being Pavlov-trained to be afraid of interacting with my computer.

    • cosmic_cheese 2 days ago

      Mystery meat interaction is one of the worst parts of the modern web, and all too often it’s found in places where there is little to no user benefit to be had for the tradeoff and a plain old HTML link or button would’ve sufficed. I wish more devs would give thought to this kind of thing.

efortis 2 days ago

Another option is using JavaScript to inject elements that need JS.
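
A minimal sketch of that, with a hypothetical placeholder (IDs and content are illustrative):

    <div id="comments-slot"></div>

    <script>
      // Only clients actually running JS ever get the JS-dependent widget
      const slot = document.getElementById("comments-slot");
      const button = document.createElement("button");
      button.textContent = "Load comments";
      slot.append(button);
    </script>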

  • bsza 2 days ago

    Doesn’t need CSS, only adds placeholder divs to the “vanilla” HTML without unnecessarily downloading what would be inside. Simple and elegant, I like it.

giantrobot 2 days ago

Reasons to have JavaScript fallbacks in your pages:

- Microbrowsers exist. They're the things that generate previews in messaging apps and social media.

- Ad blockers. Because advertising on the Web has become a complete dumpster fire, even the FBI recommends ad blockers. They can break stupid JavaScript functionality. Partially loaded JavaScript can cause additional load on the back end with retries or missing completion conditions.

- Crappy cellular. Cellular data remains unreliable and its reliability is not evenly distributed. A user can go from a great signal to shitty back to great through no fault of their own. You can't be guaranteed all your shitty JavaScript will even load.

- Spiders. Spiders have varying amounts of support for JavaScript, not everything is some variation on Chrome.

winterbloom 2 days ago

I don't get it, we should simply not support users that don't have JavaScript enabled. The vast majority of users/consumers will have it enabled

  • theandrewbailey 2 days ago

    We should simply not support developers that re-implement links, buttons, and text boxes entirely in Javascript. Such false elements lack inbuilt functionality (particularly with regards to accessibility), and are slower.

  • quotemstr 2 days ago

    For some people, it's about not breaking the web platform in ways that require JavaScript. For example, without JavaScript, links have predictable behavior. With it, you have link shaped buttons that can do anything and that often break when you try to do things the author didn't anticipate, like opening a link in a new tab.

  • userbinator 2 days ago

    Requiring JS (and especially if you trendchase the latest "modern" crap) is essentially advocating for keeping Big Browser's monopoly and effective control intact.

  • blahaj 2 days ago

    We should simply not support websites that download huge arbitrary programs from a huge number of arbitrary third-party domains just to retrieve, bit by bit, what is in the end a static website; that reimplement all kinds of standardized browser features in a non-standard way, making a huge number of assumptions and only targeting the subset of people who use specific browsers with specific settings and specific input devices in a specific way; and that break that functionality for everyone else, making the open web very closed by enforcing that clients behave and be used in a certain pre-approved way, breaking accessibility and crippling all kinds of browser features, often with disregard and often with intention.

    It's my device and I should get to decide what code I run in it, what software I use, how that software displays content to me and how I interact with it.

  • perching_aix 2 days ago

    On the contrary, I don't get why (certain) sites - and people - absolutely insist on serving even the most basic things via copious amounts of JavaScript, and other adjacent technologies. Most website content is basic multimedia, text, images and video, and most use of JavaScript is either frivolous or is supporting user-hostile designs, such as infinite scroll.

    Reminds me of the WordPress / PHP scourge of the 2010s. "You don't understand, we have to dynamically serve our site using PHP, and have an extensive CMS that we can barely use ourselves, so we'll still constantly reach out to you about it." And then the site is barely more than a brochure or a blog.

  • gjsman-1000 2 days ago

    Correct. As an HN commentator put it in 2023:

    “[Progressive enhancement] was just an idea by a few niche people who wanted to disable 1/3 to 2/3 of their browser's technology then demand that web-site operators/builders spend a lot of time and money on implementing for that sub-1% population. They didn't, and these people had no real leverage.”

    https://news.ycombinator.com/item?id=37779408

    • Grumbledour 2 days ago

      This is such a bad and reductionist take.

      When progressive enhancement was the thing, nobody wanted to "disable 1/3 to 2/3 of their browser's technology"; browsers mostly lacked modern CSS and JS features across the board, and having pages degrade gracefully while still using modern features for the few browsers that supported them was just the professional thing to do.

      It still is today of course, but it is obvious there are not many professionals left in webdev.

gjsman-1000 2 days ago

Outside of Hacker News or niche companies still supporting IE, progressive enhancement is completely dead.

Even for my own projects, I don’t care anymore; there’s a warning to turn it on with a <noscript> tag, and the whole page is otherwise hidden.
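
Roughly this shape, as a sketch (not the exact markup):

    <noscript><p>This site requires JavaScript. Please enable it.</p></noscript>

    <div id="app" style="display: none"><!-- the whole page --></div>
    <script>document.getElementById("app").style.display = "";</script>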

HN had the eulogy (with some kicking and screaming that it was premature) back in 2013. https://news.ycombinator.com/item?id=6316516

  • simonw 2 days ago

    It's only dead among people who don't care about productively building fast loading, reliable websites.

    (Which apparently is most web developers these days. Sigh.)

  • jaredcwhite 2 days ago

    "Progressive enhancement is completely dead" is in fact something I only expect to see posted on Hacker News. rolls eyes

    Meanwhile, out on the open web, people build progressively enhanced solutions all the time.

    • rikroots 2 days ago

      Two places in particular where people need to think about progressive enhancement:

      + Low-powered or low-spec devices don't like JavaScript-heavy elements. For example, canvas elements - do people with steam-powered mobile phones really need to see that JS-heavy canvas animation? Wouldn't a static image do just as well for them? (See the sketch after this list.) Same goes for intensely animated SVG scenes.

      + Apple News. JS is not a thing in AN. If you need to press a web article so it can be distributed via AN, be prepared to rip every shred of JS out of it - something that's a lot easier to do if your article has already been built with progressive enhancement in mind.
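
      A sketch of the first point, serving a static frame by default and upgrading it when JS runs (file name and id are illustrative):

          <!-- Everyone gets the static frame; JS-capable clients swap in the canvas -->
          <img id="scene" src="scene-static.png" alt="Scene (static frame)" width="640" height="360">
          <script>
            const img = document.getElementById("scene");
            const canvas = document.createElement("canvas");
            canvas.width = img.width;
            canvas.height = img.height;
            img.replaceWith(canvas);
            // ...draw and animate via canvas.getContext("2d")...
          </script>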

    • hinkley 2 days ago

      Not to mention anything that’s been “dead” for a dozen years tends to be resurrected either by people who know or by people reinventing a wheel. We all live on a pendulum.

    • gjsman-1000 2 days ago

      Sure, and I do too, when it makes sense and is simple enough. A marketing page with one form? Why not?

      But a CRUD-heavy app with forms and modals and many moving parts? Forget it. It’s about as relevant as people using IE.

      2 years ago, again on HN, someone asked if progressive enhancement “lost the war.” In reality, it never showed up for battle on its own merits.

      https://news.ycombinator.com/item?id=37779408