Ask HN: Abandoned/dead projects you think died before their time and why?
299 points by ofalkaed a day ago
Just curious and who knows, maybe someone will adopt it or develop something new based on its ideas.
The Plan 9 operating system. It's the closest thing to a Unix successor we ever got, taking the "everything is a file" philosophy to another level and allowing you to easily share those files over the network to build distributed systems. Accessing remote resources is easy and robust on Plan 9, while on other systems we need to install specialized software with bad interoperability for each individual use case. Plan 9 also had some innovative UI features, such as mouse chording to edit text, nested window managers, the Plumber to run user-configurable commands on known text patterns system-wide, etc. Its distributed nature should have made it perfect for today's world with mobile, desktop, cloud, and IoT devices all connected to each other. Instead, we're stuck with operating systems that were never designed for that. There are still active forks of Plan 9 such as 9front, but the original from Bell Labs is dead. The reasons it died are likely:

- Legal challenges (the Plan 9 license, pointless lawsuits, etc.) meant it wasn't adopted by major players in the industry.

- Plan 9 was a distributed OS during a time when having a local computer became popular and affordable, while using a terminal to access a centrally managed computer fell out of fashion (though the latter sort of came back in a worse form with cloud computing).

- Bad marketing and positioning itself as merely a research OS meant they couldn't capitalize on the .com boom.

- AT&T lost its near-endless source of telephone revenue. Bell Labs was sold multiple times over the following years, and a lot of the Unix/Plan 9 people went to other companies like Google.

> The reasons it died are likely:

The reason Plan 9 died a swift death was that, unlike Unix –
which hardware manufacturers could license for a song and adapt to their own hardware (and be guaranteed compatibility with lots of Unix software) – Bell Labs tried to sell Plan 9, as commercial software, for $350 a box. (As I have written many times in the past: <https://news.ycombinator.com/item?id=22412539>, <https://news.ycombinator.com/item?id=33937087>, and <https://news.ycombinator.com/item?id=43641480>)

Version 1 was never licensed to anyone. Version 2 was only licensed to universities, for an undisclosed price. Version 3 was sold as a book; I think this is the version you are referring to. However, note that this version contained a license that only allowed non-commercial use of the source code. It also came with no support, no community, and no planned updates (the project was shelved half a year later in favor of Inferno). More than the price tag, the problem is that Plan 9 wasn't really released until 2004.

Strictly speaking, it's not dead. The code is now open source and all the rights are with the Plan 9 Foundation: https://p9f.org/ It's just unlikely that it will get as big of a following as Linux has.

The Plan 9 Filesystem Protocol lives on inside WSL2. 9P is used everywhere in the VM ecosystem. It's clean and simple and well supported by almost all guests.

What’s stopping other Unix-like systems from adopting the "everything is a file" philosophy?

The fact that everything is not a file. No OS actually implements that idea, including Plan 9. For example, directories are not files. Plan 9 re-uses a few of the APIs for them, but you can't use write() on a directory; you can only read them. Pretending everything is a file was never a good idea and is based on an inaccurate understanding of computing. The everything-is-an-object phase the industry went through was much closer to reality. Consider how you represent a GUI window as a file. A file is just a flat byte array at heart, so:

1. What's the data format inside the file? Is it a raw bitmap? 
Series of rendering instructions? How do you communicate that to the window server, or vice versa? What about ancillary data like window border styles?

2. Is the file a real file on a real filesystem, or is it an entry in a virtual file system? If the latter, then you often lose a lot of the basic features that make "everything is a file" attractive, like the ability to move files around or arrange them in a user-controlled directory hierarchy. Virtual file systems like procfs are pretty limited. You can't even add your own entries, like adding symlinks to procfs directories.

3. How do you receive callbacks about your window? At this point you start to conclude that you can't use one file to represent a useful object like a window; you'd need at least a data and a control file, where the latter is some sort of socket speaking some sort of RPC protocol. But now you have an atomicity problem.

4. What exactly is the benefit again? You won't be able to use the shell to do much with these window files.

And so on. For this reason Plan 9's GUI API looked similar to that of any other OS: a C library that wrapped the underlying file "protocol". Developers didn't interact with the system using the file metaphor, because it didn't deliver value. All the post-UNIX operating system designs ignored this idea because it was just a bad one. Microsoft invested heavily in COM and NeXT invested in the idea of typed, IDL-defined Mach ports.

Probably that not everything can be cleanly abstracted as a file. One might want to, e.g., have fine control over how a network connection is handled. You can abstract that as a file but it becomes increasingly complicated and can make API design painful.

> Probably that not everything can be cleanly abstracted as a file.

I would say almost nothing can be cleanly abstracted as a file. 
That’s why we got ioctl (https://en.wikipedia.org/wiki/Ioctl), which is a bad API (calls mean “do something with this file descriptor”, with only conventions introducing some consistency).

Everything can be abstracted as a file, it just may not be the most efficient interface. If everything can be represented as a Foo or as a Bar, then this actually clears up the discussion, allowing the relative merits of each representation to be discussed. If something is a universal paradigm, all the better to compare it to alternatives, because one will likely be settled on (and then mottled with hacks over time; organic abstraction sprawl FTW).

They have to an extent. The /proc file system on Linux is directly inspired by Plan 9 IIRC. Other things like network sockets never got that far and are more related to their BSD kin.

There's also /dev/tcp in Linux

/dev/tcp does not exist in Linux; it's a bash feature, implemented by the shell rather than the kernel.

Probably the fact that it's a pretty terrible idea. It means you take a normal, properly typed API and smush it down into some poorly specified text format that you now have to write probably-broken parsers for. I often find bugs in programs that interact with `/proc` on Linux because they don't expect some output (e.g. spaces in paths, or optional entries). The only reasons people think it's a good idea in the first place are a) every programming language can read files, so it sort of gives you an API that works with any language (but a really bad one), and b) it's easy to poke around in from the command line. Essentially it's a hacky cop-out for a proper language-neutral API system. In fairness, it's not like Linux actually came up with a better alternative. I think the closest is probably DBus, which isn't exactly the same. 
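The /proc fragility described above is easy to demonstrate. A small sketch (the stat line below is invented; real /proc/[pid]/stat has many more fields): the second field of the file, the process name, may legally contain spaces and parentheses, which silently breaks naive whitespace-splitting parsers.

```python
# Invented example of a /proc/[pid]/stat line whose process name
# ("comm", in parentheses) contains spaces and nested parentheses.
stat_line = "1234 (my (weird) name) S 1 1234 1234 0 -1"

# Naive parser: split on whitespace and trust field positions.
naive_state = stat_line.split()[2]  # lands inside the name, not on "S"

# Robust parser: the name is delimited by the first "(" and the
# *last* ")"; everything after that point is well-behaved.
comm = stat_line[stat_line.index("(") + 1 : stat_line.rindex(")")]
fields = stat_line[stat_line.rindex(")") + 2 :].split()
state = fields[0]  # the process state, "S"
```

The naive version yields `"(weird)"` instead of the state field, which is exactly the class of bug described above.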
Maybe something like FIDL is a proper solution but I have only read a little about it: https://fuchsia.dev/fuchsia-src/get-started/learn/fidl/fidl I think you have to standardize a basic object system and then allow people to build opt-in interfaces on top, because any single-level abstraction will quickly be pulled in countless directions for as many users.

- Photon, the graphical interface for QNX. Oriented more towards real time (widgets included gauges) but good enough to support two different web browsers. No delays. This was a real-time operating system.

- MacOS 8. Not the Linux thing, but Copland. This was a modernized version of the original MacOS, continuing the tradition of no command line. Not having a command line forces everyone to get their act together about how to install and configure things. Probably would have eased the transition to mobile. A version was actually shipped to developers, but it had to be covered up to justify the bailout of NeXT by Apple to get Steve Jobs.

- Transaction processing operating systems. The first one was IBM's Customer Information Control System. A transaction processor is a kind of OS where everything is like a CGI program - load program, do something, exit program. Unix and Linux are, underneath, terminal-oriented time sharing systems.

- IBM MicroChannel. Early minicomputer and microcomputer designers thought "bus", where peripherals can talk to memory and peripherals look like memory to the CPU. Mainframes, though, had "channels", simple processors which connected peripherals to the CPU. Channels could run simple channel programs, and managed device access to memory. IBM tried to introduce that with the PS/2, but they made it proprietary and that failed in the marketplace. Today, everything has something like channels, but they're not a unified interface concept that simplifies the OS.

- CPUs that really hypervise properly. That is, virtual execution environments look just like real ones.
IBM did that in VM, and it worked well because channels are a good abstraction for both a real machine and a VM. Storing into device registers to make things happen is not. x86 has added several layers below the "real machine" layer, and they're all hacks. - The Motorola 680x0 series. Should have been the foundation of the microcomputer era, but it took way too long to get the MMU out the door. The original 68000 came out in 1978, but then Motorola fell behind. - Modula. Modula 2 and 3 were reasonably good languages. Oberon was a flop. DEC was into Modula, but Modula went down with DEC. - XHTML. Have you ever read the parsing rules for HTML 5, where the semantics for bad HTML were formalized? Browsers should just punt at the first error, display an error message, and render the rest of the page in Times Roman.
Would it kill people to have to close their tags properly?

- Word Lens. Look at the world through your phone, and text is translated, standalone, on the device. No Internet connection required. Killed by Google in favor of hosted Google Translate.

> MacOS 8. Not the Linux thing, but Copland. This was a modernized version of the original MacOS, continuing the tradition of no command line. Not having a command line forces everyone to get their act together about how to install and configure things. Probably would have eased the transition to mobile. A version was actually shipped to developers, but it had to be covered up to justify the bailout of NeXT by Apple to get Steve Jobs.

You have things backwards. The Copland project was horribly mismanaged. Anybody at Apple who came up with a new technology got it included in Copland, with no regard to feature creep or stability. There's a leaked build floating around from shortly before the project was cancelled. It's extremely unstable and even using basic desktop functionality causes hangs and crashes. In mid-to-late 1996, it became clear that Copland would never ship, and Apple decided the best course of action was to license an outside OS. They considered options such as Solaris, Windows NT, and BeOS, but of course ended up buying NeXT. Copland wasn't killed to justify buying NeXT; Apple bought NeXT because Copland was unshippable.

I was all gung ho on XHTML back in the day until I realized that a single unclosed tag in an ad or another portion of our app that I had no control over would cause the entire page to fail. The user would see nothing except a giant ugly error. And your solution of rendering the rest of the page in Times New Roman isn’t an option. Do you try to maintain any of the HTML semantics or just render plain text? If it’s plain text, that’s useless. If you’re rendering anything with any semantics, then you need to know how to parse it. You’re back where you started. 
Granted, I could ensure that my code was valid XHTML, but I’m a hypermeticulous autistic weirdo, and most other people aren’t. As much as XHTML “made sense”, it was completely unworkable in reality, because most people are slobs. Sometimes, worse really is better. if the world was all XHTML, then you wouldn't put an ad on your site that wasn't valid XHTML, the same way you wouldn't import a python library that's not valid python. >, then you wouldn't put an ad on your site that wasn't valid XHTML, You're overlooking how incentives and motivations work. The gp (and their employer) wants to integrate the advertisement snippet -- even with broken XHTML -- because they receive money for it. The semantic data ("advertiser's message") is more important than the format ("purity of perfect XHTML"). Same incentives would happen with a jobs listing website like Monster.com. Consider that it currently has lots of red errors with incorrect HTML: https://validator.w3.org/nu/?doc=https%3A%2F%2Fwww.monster.c... If there was a hypothetical browser that refused to load that Monster.com webpage full of errors because it's for the users' own good and the "good of the ecosystem"... the websurfers would perceive that web browser as user-hostile and would choose another browser that would be forgiving of those errors and just load the page. Job hunters care more about the raw data of the actual job listings so they can get a paycheck rather than invalid <style> tags nested inside <div> tags. Those situations above are a different category (semantic_content-overrides-fileformatsyntax) than a developer trying to import a Python library with invalid syntax (fileformatsyntax-Is-The-Semantic_Content). EDIT reply to: >Make the advertisement block an iframe [...] If the advertiser delivers invalid XHTML code, only the advertisement won't render. You're proposing a "technical solution" to avoid errors instead of a "business solution" to achieve a desired monetary objective. 
To reiterate, they want to render the invalid XHTML code, so your idea to just not render it is the opposite of the goal. In other words, if rendering imperfect HTML helps the business goal more than blanking out invalid XHTML in an iframe, that means HTML "wins" in the marketplace of ideas.

If XHTML had really taken off, there would just be server-side linting/HTML Tidy. It's not that hard a problem to solve. Lots of websites already do this for user-generated HTML, because even if an unclosed div doesn't take down the whole thing, it's still ugly. The real problem is that the benefits of XHTML are largely imaginary, so there isn't really a motivation to do that work.

> You're overlooking how incentives and motivations work. The gp (and their employer) wants to integrate the advertisement snippet -- even with broken XHTML -- because they receive money for it.

Make the advertisement block an iframe with the src attribute set to the advertiser's URL. If the advertiser delivers invalid XHTML code, only the advertisement won't render.

But all it takes in that world is for a single browser vendor to decide - hey, we will even render broken XHTML, because we would rather show something than nothing - and you’re back to square one. I know which I, as a user, would prefer. I want to use a browser which lets me see the website, not just a parse error. I don’t care if the code is correct.

In practice things like that did happen, though. E.g. this story of someone's website displaying user-generated content with a character outside their declared character set: https://web.archive.org/web/20060420051806/http://diveintoma...

Yes, you would be able to put an ad on your site that wasn't XHTML, because XHTML is just text parsed in the browser at runtime. And yes, that would fail, silently, or with a cryptic error.

> - XHTML. [...] Would it kill people to have to close their tags properly?

XHTML appeals to the intuition that there should be a Strict Right Way To Do Things ... 
but you can't use that unforgiving framework for web documents that are widely shared. The "real world" has 2 types of file formats:

(1) File types where consumers cannot contact/control/punish the authors (open-loop): HTML, pdf, zip, csv, etc. The common theme is that the data itself is more important than the file format. That's why Adobe Reader will read malformed pdf files written by buggy PDF libraries. And both 7-Zip and WinRAR can read malformed zip files with broken headers (because some old buggy Java libraries wrote bad zip files). MS Excel can import malformed csv files. E.g. the Citi bank export to csv wrote a malformed file and it was desirable that MS Excel imported it anyway, because the raw data of dollar amounts was more important than the incorrect commas in the csv file -- and -- I have no way of contacting the programmer at Citi to tell them to fix their buggy code that created the bad csv file.

(2) File types where the consumer can control the author (closed-loop): programming language source code like .c, .java, etc., or business interchange documents like EDI. There's no need to have a "lenient forgiving" gcc/clang compiler to parse ".c" source code because the "consumer-and-author" will be the same person. I.e. the developer sees the compiler stop at a syntax error, so they edit and fix it and try to re-compile. For business interchange formats like EDI, a company like Walmart can tell the vendor to fix their broken EDI files.

XHTML wants to be in group (2) but web surfers can't control all the authors of .html, so that's why lenient parsing of HTML "wins". XHTML would work better in a "closed-loop" environment such as a company writing internal documentation for its employees. E.g. an employee handbook can be written in strict XHTML because both the consumers and authors work at the same company. E.g. can't see the vacation policy because the XHTML syntax is wrong?!? Get on the Slack channel and tell the programmer or content author to fix it. 
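The open-loop vs. closed-loop split is easy to see with CSV. A small sketch (the malformed rows are invented, not Citi's actual output): a data-first consumer pads ragged rows to the header width and keeps going, while a format-first consumer rejects the file at the first inconsistency.

```python
import csv
import io

# Invented example of a malformed export: one row has a stray trailing
# comma, another is missing a field entirely.
raw = "date,description,amount\n2024-01-02,coffee,3.50,\n2024-01-03,rent\n"

header, *body = list(csv.reader(io.StringIO(raw)))

# Strict, format-first reading (group 2): reject at the first ragged row.
strict_ok = all(len(row) == len(header) for row in body)  # False here

# Lenient, data-first reading (group 1): pad/trim every row to fit,
# because the dollar amounts matter more than the commas.
recovered = [(row + [""] * len(header))[: len(header)] for row in body]
```

The lenient reader recovers both transactions; the strict one would show the user nothing at all.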
The problem is that group (1) results in a nightmarish race-to-the-bottom. File creators have zero incentive to create spec-compliant files, because there's no penalty for creating corrupted files. In practice this means a large proportion of documents are going to end up corrupt. Does it open in Chrome? Great, ship it! The file format is no longer the specification, but it has now become a wild guess at whatever weird garbage the incumbent is still willing to accept. This makes it virtually impossible to write a new parser, because the file format suddenly has no specification. On the other hand, imagine a world where Chrome would slowly start to phase out its quirks modes. Something like a yellow address bar and a "Chrome cannot guarantee the safety of your data on this website, as the website is malformed" warning message. Turn it into a red bar and a "click to continue" after 10 years, remove it altogether after 20 years. Suddenly it's no longer that one weird customer who is complaining, but everyone - including your manager. Your mistakes are painfully obvious during development, so you have a pretty good incentive to properly follow the spec. You make a mistake on a prominent page and the CTO sees it? Well, guess you'll be adding an XHTML validator to your CI pipeline next week! It is very tempting to write a lenient parser when you are just one small fish in a big ecosystem, but over time it will inevitably lead to the degradation of that very ecosystem. You need some kind of standards body to publish a validating reference parser. And like it or not, Chrome is big enough that it can act as one for HTML. >File creators have zero incentive to create spec-compliant files, because there's no penalty for creating corrupted files This depends. If you are a small creator with a unique corruption then you're likely out of luck. The problem with big creators is 'fuck you' I do what I want. 
>"Chrome cannot guarantee the safety of your data on this website, as the website is malformed" warning message. This would appear on pretty much every website. And it would appear on websites that are no longer updated and they'd functionally disappear from any updated browser. In addition the 10-20 year thing just won't work in US companies, simply put if they get too much pressure next quarter on it, it's gone. >Your mistakes are painfully obvious during development, Except this isn't how a huge number of websites work. They get html from many sources and possibly libraries. Simply put no one is going to follow your insanity, hence why xhtml never worked in the first place. They'll drop Chrome before they drop the massive amount of existing and potential bugs out there. >And like it or not, Chrome is big enough that it can act as one for HTML. And hopefully in a few years between the EU and US someone will bust parts of them up. That would break decades of the web with no incentive for Google to do so. Plus, any change of that scale that they make is going to draw antitrust consideration from _somebody_. You’re right, but even standards bodies aren’t enough. At the end of the day, it’s always about what the dominant market leader will accept. The standard just gives your bitching about the corrupted files some abstract moral authority, but that’s about it. > That's why Adobe Reader will read malformed pdf files written by buggy PDF libraries. No, the reason is that Adobe’s implementation never bothered to perform much validation, and then couldn’t add strict validation retroactively because it would break too many existing documents. And it’s really the same for HTML. I’d argue a good comparison here is HTTPS. Everyone decided it would be good for sites to move over to serving via HTTPS so browsers incentivised people to move by gating newer features to HTTPS only. They could have easily done the same with XHTML had they wanted. 
The opportunities to fix this were pretty abundant. For instance, it would take exactly five words from Google to magically make a vast proportion of web pages valid XHTML:

> We rank valid XHTML higher

It doesn’t even have to be true!

This is an argument for a repair function that transforms a broken document into a well-formed one without loss but keeps the spec small, simple and consistent. It's not an argument for baking malformations into a complex, messy spec.

> - XHTML. Have you ever read the parsing rules for HTML 5, where the semantics for bad HTML were formalized? Browsers should just punt at the first error, display an error message, and render the rest of the page in Times Roman. Would it kill people to have to close their tags properly?

Amen. Postel’s Law was wrong: https://datatracker.ietf.org/doc/html/rfc9413 We stop at the first sign of trouble for almost every other format; we do not need lax parsing for HTML. This has caused a multitude of security vulnerabilities and only makes it more difficult for pretty much everybody. The attitude towards HTML5 parsing seemed to grow out of this weird contrarianism that everybody who wanted to do better than whatever Internet Explorer did had their head in the clouds, and that the role of a standard was just to write down all the bugs.

Just a reminder that <bold> <italic> text </bold> </italic> [0], which has been working for ages in every browser ever, is NOT valid XHTML and should be rejected by GP's proposal. I, for one, am kinda happy that XHTML is dead.

[0]: By <bold> I mean <b> and by <italic> I mean <i>, and the reason it's not valid HTML is that the order of closing is not the reverse of the order of opening, as it should properly be.

That caused plenty of incompatibilities in the past. At one point, Internet Explorer would parse that and end up with something that wasn’t even a tree. HTML is not a set of instructions that you follow. It’s a terrible format if you treat it that way. 
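The mis-nesting above can be checked mechanically with Python's standard library: a strict XML parser (the XHTML model) rejects the tag soup outright, while the lenient HTML parser (the HTML5 model) just keeps going. A small sketch:

```python
import xml.etree.ElementTree as ET
from html.parser import HTMLParser

# The mis-nested snippet discussed above (close order != reverse open order).
snippet = "<p><b> <i> text </b> </i></p>"

# Strict XML parsing (the XHTML model): mis-nesting is a hard error.
try:
    ET.fromstring(snippet)
    strict_ok = True
except ET.ParseError:
    strict_ok = False

# Lenient HTML parsing (the HTML5 model): no error is raised; the
# parser simply reports the tags in the order they appear.
class TagCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.opened = []

    def handle_starttag(self, tag, attrs):
        self.opened.append(tag)

collector = TagCollector()
collector.feed(snippet)
# collector.opened == ["p", "b", "i"]; strict_ok == False
```

Same input, two philosophies: one refuses to guess, the other never stops.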
It’s totally valid XHTML, just not recognized. XHTML allows you to use XML, and <bold> <italic> are just XML nodes with no schema.

The correct form has been and will always be <b> and <i>. Since the beginning. The problem there is the order of tags, not their names.

Ooooo… now we’re talking. Sloppy HTML that closes a tag out of order, or just declared out of order? Or rendering bugs when bold is before italic? It’s why XHTML should have been standard. Just dump, error out, make the developer fix it.

But the problem here is that our nice programmer-brained mental model does not match the actual requirements of text. Unless you know about tree structures, it doesn’t make sense to the average person why you would have to stop and then restart a span of formatting options just because an unrelated attribute changed. And that’s why XHTML failed - HTML is human-writable.

I've edited my comment to better present the issue. Out of order closure should definitely error out with an “unclosed italic tag detected at line:…” error.

> It’s totally valid XHTML, just not recognized.

Am I right in assuming that even you didn't notice the problem the first time you looked at it?

> Out of order closure should definitely error out

Hitchens's razor: "What can be asserted without evidence can also be dismissed without evidence."

In addition to Photon, I would say QNX itself (the desktop OS). I ran QNX 6 Neutrino on my PIII 450 back in the day, and the experience was so much better than on every other mainstream OS on the market. The thing that blew me away was how responsive the desktop was while multitasking, something Linux struggled with even decades later. Similarly, I'm also gutted that the QNX 1.44MB demo floppy didn't survive past the floppy era - they had some really good tech there. Imagine if they had pitched it as a rescue/recovery OS for PCs; you could've run it entirely from the UEFI. Or say as an OS for smart TVs and other consumer smart devices. 
> Would it kill people to have to close their tags properly

It would kill the approachability of the language. One of the joys of learning HTML when it tended to be hand-written was that if you made a mistake, you'd still see something, just with distorted output. That was a lot more approachable for a lot of people who were put off "real" programming languages because they were overwhelmed by terrible error messages any time they missed a bracket or misspelled something. If you've learned to program in the last decade or two, you might not even realise just how bad compiler errors tended to be in most languages. The kind of thing where you could miss a bracket on line 47 but end up with a compiler error complaining about something 20 lines away. Rust (in particular) got everyone to raise their game with respect to meaningful compiler errors. But in the days of XHTML? Error messages were arcane; you had to dive in to see what the problem actually was.

If you forget a closing quote on an attribute in HTML, all content until the next quote is ignored and not rendered - even if it is the rest of the page. I don't think this is more helpful than an error message. It was just simpler to implement.

Let's say you forget to close a <b></b> element. What happens? Even today, after years of better error messages, the strict validator at https://validator.w3.org/check says:

Back in the day, the error messages were even more misleading than this, often talking about "Extra content at end of document" or similar. Compare that to the very visual feedback of putting this exact document into a browser. You get more bold text than you were expecting; the bold just runs into the next text. That's a world of difference, especially for people who prefer visual feedback to reading and understanding errors in text form. Try it for yourself: save this document to a .html file and put it through the XHTML validator.

You can have catastrophic parsing errors with the “lax” HTML too. 
For instance:

For reference, observe what happens if you try opening this malformed document in a browser: save it with a .xhtml extension, or serve it with MIME type application/xhtml+xml. Firefox displays naught but the error:

Thanks for showing these. We can see Firefox matches the same style of accurate but unhelpful error message. Chromium is much more helpful in the error message, directing the user to both line 19 and 22. It also made the user-friendly choice to render up to the error. In the context of XHTML, we should also keep in mind that Chrome post-dates XHTML by almost a decade.

If, on the other hand, you have some sort of XSLT errors, Firefox gives you a reasonably helpful error message in the dev tools, whereas Chromium gives you a blank document and nothing else… unless you ran it in a terminal. I’m still a little surprised that I managed to discover that it was emitting XSLT errors to stdout or stderr (don’t remember which). Really, neither has particularly great handling of errors in anything XML. None of it is better than minimally maintained; a lot of it has simply been unmaintained for a decade or more.

> Rust (in particular) got everyone to raise their game with respect to meaningful compiler errors.

This was also part of the initial draw of `clang`.

> Modula. Modula 2 and 3 were reasonably good languages. Oberon was a flop. DEC was into Modula, but Modula went down with DEC.

If you appreciate Modula's design, take a look at Nim[1]. I remember reading the Wikipedia page for Modula-3[2] and thinking "huh, that's just like Nim" in every other section.

> XHTML. Have you ever read the parsing rules for HTML 5, where the semantics for bad HTML were formalized?

I actually have, and it's not that bad. If anything, the worst part is foreign content (svg, mathml), which has different rules more similar to XML but also not the same as XML. 
Just as an aside, browsers still support XHTML: just serve it with the application/xhtml+xml MIME type, and it all works, including aggressive error checking. This is very much a situation where consumers are voting with their feet, not browser vendors forcing a choice.

> Would it kill people to have to close their tags properly?

Probably not, but what would be the benefit of having more pages fail to render? If XHTML had been coupled with some cool features which only worked in XHTML mode, it might have become successful, but on its own it does not provide much value.

> but what would be the benefit of having more pages fail to render?

I think those benefits are quite similar to having more programs failing to run (due to static and strong typing, other static analysis, and/or elimination of undefined behavior, for instance), or more data failing to be read (due to integrity checks and simply strict parsing): as a user, you get documents closer to valid ones (at least in the rough format), if anything at all, and additionally that discourages developers from shipping a mess. Then parsers (not just those in viewers, but anything that does processing) have a better chance to read and interpret those documents consistently, so even more things work predictably.

Sure, authoring tools should help authors avoid mistakes and produce valid content. But the browser is a tool for the consumer of content, and there is no benefit for the user if it fails to render some existing pages. It is like Windows jumping through hoops to support backwards compatibility even with buggy software. The interest of the customer is that the software runs.

> there is no benefit for the user if it fails to render some existing pages

What if the browser renders it incorrectly? If a corrupt tag combination leads to browser X parsing "<script>" as inline text but browser Y parsing it as a script tag, that could lead to serious security issues! 
Blindly guessing at the original author's intent whenever you encounter buggy content is a recipe for disaster. Sometimes it is to the user's benefit to just refuse to render it. and that's why HTML5 standardized the behavior, so both browsers will parse it the same, they just don't care if someone thinks it's "invalid" or not. > Windows jumping through hoops to support backwards compatibility even with buggy software This was, maybe, true some 10 years ago.
Now even old Windows programs (paint, wordpad) do not run on newer Windows > The interest of the customer is that the software runs Yes, but testing is expensive and we are Agile. /s >Now even old Windows programs (paint, wordpad) do not run on newer Windows Eh, that's a really weird example as those are components of the operating system that are replaced with the OS upgrade. If a developer accidentally left an opening comment at the start of the HTML. Rhetorical question: should the browser display the page even if it is commented out? There is some bar for what is expected to work. If all browsers consistently errored out on unclosed tags, then it would definitely force developers to close tags; it would force it to become common knowledge, second nature. HTML5 was the answer for the consistency part: where before browsers did different things to recover from "invalid" HTML, HTML5 standardizes it because it doesn't care about valid/invalid as much, it just describes behavior anyways. I used to run an RSS feed consolidator, badly formed XML was the bane of my life for a very long time. If devs couldn't even get RSS right, a web built on XHTML was a nonstarter. XHTML is XML. XML-based markup for content can be typeset into PDF, suitable for print media. I invite you to check out the PDFs listed in the intro to my feature matrix comparison page, all being sourced from XHTML: Nice list. Some thoughts: - I think without the move to NeXT, even if Jobs had come back to Apple, they would never have been able to get to the iPhone. iOS was - and still is - a unix-like OS, using unix-like philosophy, and I think that philosophy allowed them to build something game-changing compared to the SOTA in mobile OS technology at the time. So much so, Android follows suit. It doesn't have a command line, and installation is fine, so I'm not sure your line of reasoning holds strongly.
One thing I think you might be hinting at, though, that is a missed trick: macOS today could learn a little from the way iOS and iPadOS are forced to do things and centralise configuration in a single place. - I think transaction processing operating systems have been reinvented today as "serverless". The load/execute/quit cycle you describe is how you build in AWS Lambdas, GCP Cloud Run Functions or Azure Functions. - Most of your other ideas (with an exception, see below) died either because of people trying to grab money rather than build cool tech, and arguably the free market decided to vote with its feet - I do wonder when we might next get a major change in hardware architectures again though, it does feel like we've now got "x86" and "ARM" and that's that for the next generation. - XHTML died because it was too hard for people to get stuff done. The forgiving nature of the HTML specs is a feature, not a bug. We shouldn't expect people to be experts at reading specs to publish on the web, nor should it need special software that gatekeeps the web. It needs to be scrappy, and messy and evolutionary, because it is a technology that serves people - we don't want people to serve the technology. > XHTML died because it was too hard for people to get stuff done. This is not true. The reason it died was that Internet Explorer 6 didn’t support it, and that hung around for about a decade and a half. There was no way for XHTML to succeed given that situation. The syntax errors that cause XHTML to stop parsing also cause JSX to stop parsing. If this kind of thing really were a problem, it would have killed React. People can deal with strict syntax. They can manage it with JSX, they can manage it with JSON, they can manage it with JavaScript, they can manage it with every back-end language like Python, PHP, Ruby, etc. The idea that people see XHTML being parsed strictly and give up has never had any truth to it.
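For what it's worth, the strict-vs-forgiving split being argued here is easy to demonstrate with Python's stdlib (html.parser is nowhere near a full HTML5 parser, but the contrast holds): an XML parser, which is what XHTML demands, hard-fails on markup that a lenient HTML parser shrugs off.

```python
# Strictness demo: XML parsing (the XHTML model) rejects an unclosed tag
# outright, while lenient HTML parsing (the HTML5 model) recovers and
# keeps going. Toy sketch, not real browser behavior.
import xml.etree.ElementTree as ET
from html.parser import HTMLParser

broken = "<p>one<p>two"  # two <p> tags, neither ever closed

# Strict (XML/XHTML-style): refuses the document.
try:
    ET.fromstring(broken)
    strict_ok = True
except ET.ParseError:
    strict_ok = False

# Lenient (HTML-style): records what it saw and soldiers on.
class TagCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.tags = []

    def handle_starttag(self, tag, attrs):
        self.tags.append(tag)

collector = TagCollector()
collector.feed(broken)

print(strict_ok)       # False: strict parsing rejects the document
print(collector.tags)  # ['p', 'p']: lenient parsing accepts both tags
```

The same input is a fatal error under one model and perfectly usable under the other, which is exactly the trade-off both sides above are weighing.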
> The syntax errors that cause XHTML to stop parsing also cause JSX to stop parsing. If this kind of thing really were a problem, it would have killed React. JSX is processed during the build step; XHTML is processed at runtime, by the browser. They would have gotten another modern OS instead of NeXT as the base for MacOSX (then iOS). Another possibility they were exploring was buying BeOS, which would have been pretty interesting because it was an OS built from scratch in the 90's without any of the cruft from the 70's. Also, the only thing specific to NeXT that survived in MacOSX and iOS was Objective-C and the whole NeXTSTEP APIs, which honestly I don't think is a great thing. It was pretty cool in the 90's but when the iPhone was released it was already kinda obsolete. For the kernel, Linux or FreeBSD would have worked just the same. > without any of the cruft from the 70's By "cruft" you mean "lessons learned", right? Didn't Google already own Android when iOS was announced? Yes, and they were going to position it against Windows Mobile. When iOS was announced, Google scrambled to re-do the entire concept. Not so much Windows Mobile, which never achieved serious market share. It was originally more planned to be a Blackberry competitor, and the early Android handset prototype concepts were all blackberry knockoffs with similar physical keyboard layouts. It has always appeared, though, like you suggest, that the project quickly pivoted to candy bar touch phones following the release of the original iPhone. It's worthwhile to remember that the industry wasn't nearly as convinced that touching glass was the future of mobile typing in 2007 as it later became, and the sales volume of Blackberrys back then was often incorrectly cited as evidence to support the case against touch. > https://www.bgr.com/tech/iphone-vs-android-original-google-b...
I clean forgot about Blackberry :) Android team ended up delaying Android release by a year: https://appleinsider.com/articles/13/12/19/googles-reaction-... On XHTML, I think there was room for both HTML and a proper XHTML that barks on errors. If you're a human typing HTML or using a language where you build your HTML by concatenation like early PHP, sure it makes sense to allow loosey goosey HTML but if you're using any sort of simple DOM builder which should preclude you from the possibility of outputting invalid HTML, strict XHTML makes a lot more sense. Honestly I'm disappointed the promised XHTML5 never materialized alongside HTML5. I guess it just lost steam. But an HTML5 parser will obviously parse "strict" HTML5 just fine too, what value is there to special-case the "this was generated by a DOM builder" path client-side? > Honestly I'm disappointed the promised XHTML5 never materialized alongside HTML5. I guess it just lost steam. The HTML Standard supports two syntaxes, HTML and XML. All browsers support XML syntax just fine—always have, and probably always will. Serve your file as application/xhtml+xml, and go ham. +1 Copland BeOS. I like to daydream about an alternate reality where it was acquired by Sony, and used as the foundation for PlayStation, Sony smartphones, and eventually a viable alternative to Windows on their Vaio line. Neal Stephenson, https://web.stanford.edu/class/cs81n/command.txt : > Imagine a crossroads where four competing auto dealerships are situated… (Apple) sold motorized vehicles--expensive but attractively styled cars with their innards hermetically sealed, so that how they worked was something of a mystery. > (Microsoft) is much, much bigger… the big dealership came out with a full-fledged car: a colossal station wagon (Windows 95). It had all the aesthetic appeal of a Soviet worker housing block, it leaked oil and blew gaskets, and it was an enormous success. > On the other side of the road… (Be, Inc.)
is selling fully operational Batmobiles (the BeOS). They are more beautiful and stylish even than the Euro-sedans, better designed, more technologically advanced, and at least as reliable as anything else on the market--and yet cheaper than the others. > … and Linux, which is right next door, and which is not a business at all. It's a bunch of RVs, yurts, tepees, and geodesic domes set up in a field and organized by consensus. The people who live there are making tanks. It would be years before OS X could handle things that wouldn’t cause BeOS to break a sweat, and BeOS still has a bit of a responsiveness edge that OS X can't seem to match (probably due to the PDF rendering layer). > - XHTML. Have you ever read the parsing rules for HTML 5, where the semantics for bad HTML were formalized? Browsers should just punt at the first error, display an error message, and render the rest of the page in Times Roman. Would it kill people to have to close their tags properly? IMO there's a place for XHTML as a generated output format, but I think HTML itself should stay easy to author and lightweight as a markup format. Specifically when it comes to tag omission, if I'm writing text I don't want to see a bunch of `</li>` or `</p>` everywhere. It's visual noise, and I just want a lightweight markup. > IBM MicroChannel. Early minicomputer and microcomputer designers thought "bus", where peripherals can talk to memory and peripherals look like memory to the CPU. Mainframes, though, had "channels", simple processors which connected peripherals to the CPU. TIL: what microchannel meant by micro and channel. Also it had OS independent device-class drivers. And you could stuff a new CPU on a card and pop it right in. Went from a 286+2MB to a 486dx2+32MB. The Word Lens team was bought by Google; it's far better in Google Translate than the local app ever was.
You could recreate the old app with a local LLM now pretty easily, but it still wouldn't be as close in quality to using Google Translate. CICS is still going strong as part of z/OS. There are industries where green screen, mainframe terminal apps still rule and CICS is driving them. CICS seems perfectly fine in problem spaces where requirements change slowly enough that one can trade development time for reliability (read: finance and insurance). CICS and HATS are perhaps the most annoying pieces of technology I’ve ever encountered. I love this mismatched list of grievances and I find myself agreeing with most of them. XHTML and proper CPU hypervisors in particular. People being too lazy to close the <br /> tag was apparently a gateway drug into absolute mayhem. Modern HTML is a cesspool. I would hate to have to write a parser that's tolerant enough to deal with all the garbage people throw at it. Is that part of the reason why we have so few browsers? > People being too lazy to close the <br /> tag was apparently a gateway drug into absolute mayhem. Your chronology is waaaaaaaaaaaay off. <BR> came years before XML was invented. It was a tag that didn’t permit children, so writing it <BR></BR> would have been crazy, and inventing a new syntax like <BR// or <BR/> would have been crazy too. Spelling it <BR> was the obvious and reasonable choice. The <br /> or <br/> spelling was added to HTML after XHTML had already basically lost, as a compatibility measure for porting back to HTML, since those enthusiastic about XHTML had taken to writing it and it was nice having a compatible spelling that did the same in both. (In XHTML you could also write <br></br>, but that was incorrect in HTML; and if you wrote <br /> in HTML it was equivalent to <br /="">, giving you one attribute with name "/" and value "".
There were a few growing pains there, such as how <input checked> used to mean <input checked="checked">—it was actually the attribute name that was being omitted, not the value!—except… oh why am I even writing this, messy messy history stuff, engines doing their own thing blah blah blah, these days it’s <input checked="">. Really, the whole <… /> thing is more an artefact of an arguably-misguided idea after a failed reform. The absolute mayhem came first, not last. > I would hate to have to write a parser that's tolerant enough to deal with all the garbage people throw at it. The HTML parser is magnificent, by far the best spec for something reasonably-sized that I know of. It’s exhaustively defined in terms of state machines. It’s huge, far larger than one would like it to be because of all this compatibility stuff, but genuinely easy to implement if you have the patience. Seriously, go read it some time, it’s really quite approachable. > The <br /> or <br/> spelling was added to HTML after XHTML had already basically lost This is untrue. This is the first public draft of XHTML from 1998: > Include a space before the trailing / and > of empty elements, e.g. <br />, <hr /> and <img src="karen.jpg" alt="Karen" />. — https://www.w3.org/TR/1998/WD-html-in-xml-19981205/#guidelin... Not really. HTML5 parsing is very well documented and quite easy compared to all the other things a browser needs. The reason XHTML failed is that the spec required it to be sent with a new MIME type (application/xhtml+xml) which no webserver did out of the box. Everything defaulted to text/html, which all browsers would interpret as HTML, and given the mismatching doctype, would interpret as tag soup (quirks mode/lenient). Meanwhile, local files with the doctype would be treated as XHTML, so people assumed the doctype was all you needed.
So everyone who tried to use XHTML didn't realize that it would go back to being read as HTML when they upload it to their webserver/return it from PHP/etc. Then, when something went wrong/worked differently than expected, the author would blame XHTML. Edit: I see that I'm getting downvoted here; if any of this is factually incorrect I would like to be educated please. > The reason XHTML failed is that the spec required it to be sent with a new MIME type (application/xhtml+xml) which no webserver did out of the box. Everything defaulted to text/html, which all browsers would interpret as HTML, and given the mismatching doctype, would interpret as tag soup (quirks mode/lenient). None of that is correct. It was perfectly spec. compliant to label XHTML as text/html. The spec. that covers this is RFC 2854 and it states: > The text/html media type is now defined by W3C Recommendations; the latest published version is [HTML401]. In addition, [XHTML1] defines a profile of use of XHTML which is compatible with HTML 4.01 and which may also be labeled as text/html. — https://datatracker.ietf.org/doc/html/rfc2854 There’s no spec. that says you need to parse XHTML served as text/html as HTML not XHTML. As the spec. says, text/html covers both HTML and XHTML. That’s something that browsers did but had no obligation to. The mismatched doctype didn’t trigger quirks mode. Browsers don’t care about that. The prologue could, but XHTML 1.0 Appendix C told you not to use that anyway. Even if it did trigger quirks mode, that makes no difference in terms of tag soup. Tag soup is when you mis-nest tags, for instance <strong><em></strong></em>. Quirks mode was predominantly about how it applied CSS layout. There are three different concepts being mixed up here: being parsed as HTML, parsing tag soup, and doctype switching. The problem with serving application/xhtml+xml wasn’t anything to do with web servers. The problem was that Internet Explorer 6 didn’t support it.
After Microsoft won the browser wars, they discontinued development and there was a five-year gap between Internet Explorer 6 and 7. Combined with long upgrade cycles and operating system requirements, this meant that Internet Explorer 6 had to be supported for almost 15 years globally. Obviously, if you can’t serve XHTML in a way browsers will parse as XML for a decade and a half, this inevitably kills XHTML. Isn't that what the <!DOCTYPE> tag was supposed to solve? Yes, I covered that; everyone assumed that you only needed to specify the doctype, but in practice browsers only accepted it for local files or HTTP responses with Content-Type: application/xhtml+xml. I've edited the comment to make that more explicit. Ah, I see. Yeah, that's a bit silly. They should've gone for "MUST have doctype, SHOULD have content type". Google Wave. Edit: you asked why. I first saw it at SELF where Chris DiBona showed it to me and a close friend. It was awesome. Real time translation, integration of various types of messaging, tons of cool capabilities, and it was fully open source. What made it out of Google was a stripped down version of what I was shown, the market rejected it, and it was a sad day. Now, I am left with JIRA, Slack, and email. It sucks. Google Wave was built on an awesome technology layer, and they totally blew it on the user interface.... deciding to treat it as a set of separate items instead of a single document everyone everywhere all at once could edit.... killed it. It made it seem needlessly complicated, and effectively erased all the positives. I think this is spot on.
A document metaphor would have made a Wave a lot easier to understand. I was blown away by the demo but then after I thought about it, it seemed like a nightmare to me. All the problems of Slack, of having to manually check channels for updates, except x100 (yea, I get that Slack wasn't available then). My point is I saw that it seemed impossible to keep up with nested, constantly updated hierarchical threads. Keeping up with channels on Slack is bad enough, so imagine if Wave had succeeded. It'd be even worse. Wave was great for conversation with one or two other people on a specific project, which I'm sure most people here used it for. I can't imagine it scaling well beyond that. Maybe one could have worked around that by embedding Yahoo Pipes, thus automating the x100. Google Wave had awesome tech but if you look at the demo in hindsight you can tell it’s just not a very good product. They tried making an all-in-one kind of product which just doesn’t work. In a sense Wave still exists but was split into multiple products, so I wouldn’t say it’s “dead”. The tech that powered it is still used today in many of Google’s popular products. It turns out that having separate interfaces for separate purposes is just more user friendly than an all-in-one. I managed trips with friends and it was a great form factor for ad-hoc discussions with docs and links included. I thought it was the future and in my very early programming days wrote probably the most insecure plugin ever to manage your servers. https://github.com/shano/Wave-ServerAdmin It's been 16 years. I should probably archive this. It was smoke and mirrors, spiced with everyone letting their imagination run away. I downloaded the open-source version of the server to see if I could build a product around it, but it came with a serious limitation: The open-source server did not persist any data. That was a complete non-starter for me. At that point I suspected it wasn't going anywhere.
My suspicions were confirmed when I sat near some Wave team members at an event, and overheard one say, with stars in his eyes, "won't it be groovy when everyone's using Wave and..." --- Cool concept, though. Immediately thought of this. Even the watered-down version of Wave was something I used at my host startup, it was effectively our project management tool. And it was amazing at that. I don't know how it would fare compared to the options available today, but back then, it shutting down was a tremendous loss. Isn't Nextcloud (including Nextcloud Talk) a viable alternative? Certainly, something like Discord (centralized and closed source) isn't. Discord is function-wise the best now... I don’t get the downvotes.
Discord for all its flaws is amazing.
I never experienced wave so maybe the comparison is not a good one? It's indeed not a good one. Discord refined instant messaging and bolts other things on top like forums but isn't fundamentally different. Google Wave was (and still is) a completely different paradigm. Everything was natively collaborative: it mixed instant messaging with document editing (like Google Docs or pads) and any widget you could think of (polls, calendars, playing music, drawing, ...) could be added by users through sandboxed Javascript. The current closest I can think of is DeltaChat's webxdc. Is there a video or anything of this version of Wave? I haven’t found one showing what Chris showed. Most seem to focus on just communications with little demonstration of productivity or other features. This is sad to me because its most glorious asset was being open source with a rich set of plugins/extensions allowing tons of functionality. wave was fucking amazing. buggy but amazing. Google sucked/s at executive function because they completely lack appreciation for proper R&D and long-term investment and also kill things people use and love. Honestly a lot of the time they seem to be in "what do humans want?" mode. Yep. And rather than ask people, focus group, or look at the evidence, they just guess or do whatever they want. Not much leadership or community engagement appears to be involved. I don’t think that’s an entirely fair characterization. They obviously spend a great deal of time focusing on what their (human) shareholders want. Well, that's fair. Overpaid managers and principal engineers spun "secret projects" and products like Glass to be an elitist experience for special people. But I won't forgive not letting Wave bake and mature. Q: Do they have non-human shareholders I don't know about, or do they have shareholders who lack qualities present in most living human beings?
I’ve only met a few of their significant shareholders and based on that, I’d say the jury is still out. I remember being excited by wave when the demo hit but never had a use for what it offered at that point in my career. Adobe Flash / Shockwave. After all these decades, I've yet to see a tool that makes it as easy to make games or multimedia as Flash did. One of many reminders recently (many others in politics) that humanity doesn't just inevitably or linearly move forward in any domain, or even 2 steps forward 1 step back. Some things are just lost to time - maybe rediscovered in a century, maybe never. Enabling novice normies to make games was excellent, and I believe the whole game industry benefited from this resulting injection of fresh ideas. A lot of indie developers with fresh takes on what games could be got started this way. Zachtronics is one example of many that comes to mind right now. On the other hand, for every flash game made there were about ten thousand flash-based ads, and nearly as many websites that used flash poorly for things like basic navigation (remember flash based website dropdown menus?). And for a few years it seemed like every single restaurant with a website was using flash for the entire thing, the results were borderline unusable in the best cases. And let's not forget that as long as flash was dominant, it was choking out the demand to get proper video support into browsers. Flash based video players performed like dog shit and made life on Linux a real chore. This post reminded me about the good time I had watching Salad Fingers and Happy Tree Friends. "Na, na, nanana na". I wish Flash would have died sooner. It was a plague on the web, you couldn't zoom, select text, go back, just a black box ignoring everything about your web browser. Killing it was probably the best thing Jobs ever did. Flash players had zoom built in.
And I believe there were textareas that allowed people to copy and paste text if they wanted, though it wasn't very common. Flash was the last thing that got people excited for the Web generally. Flash was the original web Excel (also Lotus 1-2-3) -- a simultaneous design + data + programming tool. These are terrible for maintainability, but excellent for usability. On the whole, I'd say it was easily a loss for the greater web that web programming left the citizen-programmer behind. (By requiring them all to turn into hamfisted front-end javascript programmers...) Many of the centralized evils of the current web might have been avoided if there had remained an onramp for the neophyte to really create for the web. I.e. Facebook et al. might have instead been replaced by a hosted, better-indexed Macromedia create + edit + host platform. Or the amount of shit code produced by inexperienced front-end devs throwing spaghetti at IE might have been reduced. This. Flash was awful. I see people defending it and I feel like I’m taking crazy pills. It was both awful when it showed up in the enterprise and amazing at unleashing creativity for many. Most young non-technical people I knew during its rise had regularly made Flash creations or even games, and deeply enjoyed the Cambrian explosion of games and animations for a few years. It was really meant for animation and games but got misused as a web GUI tool. I think it would've been fine to allow it anyway, and anyone who wants to build a GUI can just not use Flash. I dunno, a whole subtree of the internet died and I’m not sure it really came back. It was a beautiful Galápagos Islands. For the most part, people are talking about games and animation, not text based websites. Did you ever try one of those Flash-based room escape games? It was really amazing to lose yourself in the challenges and puzzles. The big issue with Flash was how overused it was.
When Flash was on its way out one app made at the place I worked still said they needed it, and I couldn't figure out why... it was a Java app. After some digging, I found it, some horizontal dividers on the page. They could have, and should have, just been images. They didn't do anything. Yet someone made them in Flash. I'd also say all the drop-down menu systems were an overuse. Splash screens on every car company's home page. It was out of hand. I guess you could call it a victim of its own success, where once it was time for it to die (due to mobile), very few people were sad to see it go. Godot is pretty awesome. Easy to learn, can do 2D or 3D, and can export to HTML5/webasm that works across all major OSes and browsers including mobile. It’s far from perfect but I’ve been enjoying playing with it even for things that aren’t games and it has come a long way just in the last year or two. I feel like it’s close to (or is currently) having its Blender moment. Even if Adobe had gotten their act together and fixed all security holes, Apple would have still killed it. It was always a threat as a popular design tool. And decades later, with the HTML canvas hype faded, there's still no replacement for what Adobe Flash could do - any designer could create stellar, interactive design that can be embedded into any website...without a monthly subscription. True, I do think Godot is on the right path, I haven’t had time to look into it in detail, but their HTML5 export seems solid from the videos I saw. Those tools were awesome. But as formats go, they were awful due to bad performance and more security holes than anything else. I still miss Macromedia Fireworks. > more security holes than anything else. yeah it wasn't secure but; > bad performance I don't think that's the case. For the longest while flash was faster than js at doing anything vaguely graphic based. The issue for apple was that the CPU in the iphone wasn't fast enough to do flash and anything else.
Moreover Adobe didn't get on with Jobs when they were talking about custom versions. You have to remember that "apps" were never meant to be a thing on the iPhone, it was all about "desktop-like" web performance. I remember well. I earned my living for a few years around 2010 porting slow Flash sites to regular web tech. It was hard to translate some functionality, but Flash was definitely slow compared to the equivalent regular website done without the plugin. Macromedia Fireworks was an outstanding piece of software. The 20 most common things you’d do with the tool were there for you in obvious toolbars. It had a lot of advanced features for image editing. It had a scripting language, so you could do bulk editing operations. It supported just about every file extension you could think of. Most useful feature of all was that it’d load instantly. You’d click the icon on the desktop, and there’d be the Fireworks UI before you could finish blinking. Compared to 2025 Adobe apps, where you click the desktop icon and make a coffee while it starts, it’s phenomenal performance. Performance was way better than what we have now with modern web stacks, we just have more powerful computers. I agree on security and bugs, but bugs can be fixed. It just shows neglect by Adobe, which was, I think, the real problem. I think that if Adobe seriously wanted to, it could have been a web standard. Lots of people say performance was good, but that seems to be through the nostalgic lens of a handful of cool games. Those did sometimes run really great, but most implementations were indeed very slow. I remember vividly because it was part of my job back then to help with web performance and when we measured page speed and user interface responsiveness flash was almost always the worst. Right. But that doesn't mean the performance of Flash was bad for what it was doing. Or that it was worse than the performance of doing the same thing in modern HTML+CSS now.
The default, and by far the most common, output from Flash had significantly slower click-to-response, network latency, and rendering than HTML+CSS has today. You remembering a few optimised instances does not change the reality that Flash was bad. You're still comparing Flash on twenty year old hardware to HTML+CSS on modern hardware. I am not and have never compared them in the way you say I did. You literally wrote "Or that it was worse than the performance of doing the same thing in modern HTML+CSS now." so I had to somehow respond to that strange claim. Of course modern computers are orders of magnitude more powerful! But Flash was definitely worse on the same hardware and network stack compared to vanilla (non-plugin based) web tech. Flash performance is still better than current web stack's. Probably will always be - you could write non-trivial games that would work on a 128MB memory machine. Currently a single browser tab with a simple page can take more than that. > more security holes than anything else. Adobe was never known for its security or quality. Try Roblox, if you haven't yet. I was SO impressed. Everything works as expected. 5 minutes after starting the game making kit I totally understood why Roblox is worth billions. It just works. It's magic. All can be scripted, but also any 6y.o. can use it. Flash was the HyperCard of the 90s/early 2000s. There hasn’t been a replacement, yet. I was even fine with Flash being misused for web GUIs, just to pressure the open web to get its act together. At least devs got to pick 2 between [fancy, fast, easy]. If you want something better, make it instead of hobbling the competition. Personal pet peeve, but as someone who still makes gifs, Image Ready. Adobe kind of absorbed Image Ready into Photoshop and it's just never lived up to how easy it was to make simple gifs in Image Ready. Yes. I never used flash personally, but I loved those little games people created with them.
There was the whole scene of non-developers creating little games of all kinds and it just ceased to exist. There is still a way to run flash apps via https://ruffle.rs/
You can probably still make flash games and run them via ruffle. Ruffle is amazing. I launched a 20+ year old game yesterday with zero compatibility issues. Even better than the original Flash because of superior security isolation mechanisms. Are there any ways that I can make games or something? Like I want to make websites about me similar to those in neocities, right, those flashy nice (good?) artistic UI. I suck at CSS. I don't know, but I never really got a feedback attention loop, and heck, even AI can make it better than me. But I want to show the world what I myself can make as well and not just what I prompt or get back. I want a good feedback loop, can Flash be useful for this purpose? Like maybe I want a website like uh something early browser times. I am kinda interested in building something like a netscape navigator-esque thing even though I wasn't born in that era, or maybe windows xp style. I have mixed opinions about AI tbh. I genuinely just want to learn things right now, it might take me more time. I have been beating myself up over using AI and not feeling equal to writing things by hand. So I want to prove to myself that I can write things/learn things by hand as well. Like I tried using it to study but the lure to make things right away and then trapping you later is definitely there, it feels easy in the start imo and that's the lure, and I kinda want to stay away from that lure to develop my skills, maybe not right now, then later. Flash is kind of dead now, I don't think the tools to create new Flash software are even released anymore. I would recommend learning Godot to make a game. There's some great tutorials like here - https://www.gdquest.com/library/first_2d_game_godot4_vampire... Kids now create games in Roblox.
More constrained, more commercial, more exploitative - but there is still a huge scene of non-developers creating games if you care to look.

Lytro light field cameras.
The tech was impressive and the company was able to put two products onto the shelves, though unfortunately they hadn't quite reached the image quality needed for professional photographers. But now, with the new Meta Ray-Bans featuring a light field display and new media like gaussian splats, we're on the verge of being able to make full use of all the data those cameras were able to capture, beyond the "what if you could fix your focus after shooting" demos of back then. Beyond high tech, there's a big market for novelty kinda-bad cameras like Polaroids or Instax. The first Lytro had the perfect form factor for that and was already bulky enough that slapping a printer on it wouldn't have hurt.

> unfortunately they hadn't quite reached the image quality needed for professional photographers.

I always wondered about that - since it works by interleaving pixels at different focal depths, there's always going to be a resolution tradeoff that a single-plane focus camera wouldn't have. It's such a cool idea though, and no more difficult to manufacture than a sensor + micro lens array.

In fact, the Lytro Illum (the big one) had a really nice, very flexible, bright super-zoom lens. If you ever wondered how that was achieved: having the microlens array and a light field sensor (1) allows relaxing so many aberration constraints on the lens that you could have a light, compact super-zoom. (1) it's not really different focal depths, it's actually more like multiple independent apertures at different spatial locations, each with a lower-resolution sensor behind it - stereovision on steroids (stereoids?)

Don't phones do this now? I remember Lytro cameras, they were really exciting.

Phone cameras fake it.
They don't capture a light field like Lytro did. They capture a regular image with a very deep depth of field, extract a depth map (usually with machine learning, but some phones augment it with stereoscopy or even LIDAR on high-end iPhones), and then selectively blur based on depth.

A lot of things on https://killedbygoogle.com/ . I used to use 30-40 Google products and services. I'm down to 3-4.

- Google Picasa: Everything local, so fast, so good. I'm never going to give my photos to G Photos.
- Google Hangouts: Can't keep track of all the Google chat apps. I use Signal now.
- Google G Suite Legacy: It was supposed to be free forever. They killed it, tried to make me pay. I migrated out of Google.
- Google Play Music: I had uploaded thousands of MP3 files there. They killed it. I won't waste my time uploading again.
- Google Finance: Tracked my stocks and funds there. Then they killed it. Won't trust them with my data again.
- Google NFC Wallet: They killed it. Then Apple launched the same thing, and took over.
- Google Chromecast Audio: It did one thing, which is all I needed. Sold mine as soon as they announced they were killing it.
- Google Chromecast: Wait, they killed Chromecast? I did not know that until I started writing this..

Google Reader: I will forever be salty about how Google killed something that likely required very little maintenance in the long run. It could have stayed exactly the same for a decade and I wouldn't have cared, because I use an RSS reader exactly the same way I did back in 2015.

Yes. That was the single worst business decision in Google history, as somebody correctly noted. It burned an enormous amount of goodwill for no gain whatsoever. Killing Google Reader affected a relatively small number of users, but these users disproportionately happened to be founders, CTOs, VPs of engineering, social media luminaries, and people who eventually became founders, CTOs, etc.
They had been painfully taught not to trust Google, and, since that time, they didn't. And still don't.

Just think of the data mining they could have had there. They had a core set of ultra-connected users who touched key aspects of the entire tech industry. The knowledge graph you could have built out of what those people read and shared… They could have just kept the entire service running with, what, 2 software engineers? Such a waste.

This would require the decision-maker to think and act at the scale and in the interests of the entire company. Not at the scale of a promo packet for the next perf cycle: "saved several millions in operation costs by shutting down a low-impact, unprofitable service."

Yes, Google killing Reader was probably the first time they killed a popular product, and what started the idea that any Google product could be killed at any time.

There is some truth in this. I fit into a few of these buckets and I don't think I could ever recommend their enterprise stuff after having my favourite consumer products pulled.

I never understood why no one built a copycat (like "bgr" -> "better google reader" :-D).
There would have been a clear chance to fill this vacuum? The thing is: I guess they didn't see a good way to monetize it (according to their "metrics"), while the product itself had relatively high OpEx and was somewhat of a niche thing.

Yes! I loved this product… it was our little social network for my friends and coworkers.

> Google Play Music: I had uploaded thousands of MP3 files there. They killed it. I won't waste my time uploading again.

You can argue whether it's as good as GPM or not, but it's false to imply that your uploaded music disappeared when Google moved to YouTube Music. I made the transition, and all of my music moved without a new upload.

YouTube Music isn't available in all the countries Google Play Music was available in. My music was deleted.

You made the transition, under differing licensing terms. Not always an option.

Chromecast Audio still works! They just don't sell them anymore. I use mine every day, and have been keeping an eye out for anyone selling theirs...

Hmm, good to know. But given Google's history, I assumed that it would stop working. I also need to sell my Google Chromecast with Google TV 4K. Brand new, still in its shrink wrap. Bought it last year, to replace a flaky Roku. It was a flaky HDMI cable instead. I trust Roku more than Google for hardware support.

In absolutely shocking news, it did stop working, and then Google went out of their way to fix it. I genuinely thought all the Chromecast Audios I owned were useless bricks and was looking around for replacements, and then they just started working again after an OTA update. Astounding. I assume someone got fired for taking time away from making search worse to do this. (edit: https://www.techradar.com/televisions/streaming-devices/goog...)
They are still selling their remaining stock and vowed to keep supporting it with bug fixes and security updates: https://blog.google/products/google-nest/chromecast-history/ Of course, another question is how long they will honor that commitment.

Picasa was awesome; they had face recognition years before almost everything else, in a nice offline package. Unfortunately the last public version has a bug that randomly swaps face tags, so you end up training on the wrong person's face just enough to throw it all off, and the recognition becomes effectively worthless on thousands of family photos. 8( Digikam is a weak-sauce replacement that barely gets the job done.

Immich is supposed to solve this nowadays: https://github.com/immich-app/immich

Immich is the closest thing I've found to Picasa. However, I would just point out you can still download and use Picasa 3.9 on Windows.

I'm still amused that they killed Google Notebook and then a few years later created Google Keep, an application with basically the same exact feature set.

You can say that for a fair few of the services mentioned by GP. Google killed a lot of things to consolidate them into more "integrated" (from their perspective) product offerings. Picasa -> Photos, Hangouts -> Meet, Music -> YT Premium. No idea what NFC Wallet was, other than the Wallet app on my phone that still exists and works? The only one I'm not sure about is Chromecast - a while back mine had an "update" to start using their newer AI Assistant system for managing it. Still works.

I still use Picasa; it works fine. However, when Google severed the GDrive-Photos linking, it meant my photos didn't automatically download from Google to my PC. This is what killed Google for me.

Google Search: Not officially dead yet, but....

Yup, losing 0.000087% year-over-year, so in 865 billion years it'll be dead :)

That was probably me, when I stopped using Google Search some years ago.
:-) Got tired of the ads, the blog spam, and AI-generated content crap floating to the top of their results page. The https://udm14.com/ flavor of Google is quite usable, though, esp with notable operators like inurl:this-or-that. But, all in all, yeah, gimme back vanilla Google search from 2008-2010 or so. Back then it was definitely a tool (I worked in investigative journalism at the time), whereas currently "searching" stands for sitting with fingers crossed and hoping for the best. But, oh well. </rant>

Kagi has been a great replacement for me. Less blogspam, I've found, plus it doesn't give me AI results unless I explicitly tell it I want AI results by adding a "?" to the end of my query.

That's more what I meant. Sure, lots of people still type stuff into the URL bar that takes them to www.google.com/search. But whatever you want to call that results page now, it's no longer Google Search in anything but name.

The same can be said if you compare www.google.com search from 2012 and 2022; times are changing… I am not defending Google search here - I haven't used it except by accident in a long time now - but to say Google search is "dying", like you often hear (especially here on HN), is a serious detachment from reality.

How did you go bankrupt? Two ways. Gradually, then suddenly. - Ernest Hemingway, The Sun Also Rises

I guess I've heard it all now… Google going bankrupt would not have made the Top-1-Million list of likely things to read on a Sunday morning…

Google G Suite offered a free option after initially saying it was ending. Just logged into my Workspace account: https://ibb.co/99jBLJnD - still have many domains on there, all with Gmail.

I'm still upset that Google Maps no longer tracks my location. It was very useful to be able to go back and see how often and where I had gone. Is there another app where I can store this locally?

Google Maps still tracks my location.
The difference is they no longer store the data on their servers; it's stored on your phone (iPhone/Android): https://support.google.com/maps/answer/6258979 That way, they can't respond to requests for that data by governments, as they don't have it. I can look on my phone and see all the places I've been today/yesterday, etc.

I use this free and extremely bare-bones app made by a friend: https://apps.apple.com/us/app/max-where/id1579123291. It tracks your location constantly, has a basic viewer, and lets you export to CSV. That's about it, but it's all I need.

Check out Dawarich. It has an official iOS app, and you can use a number of 3rd-party mobile apps to track your data and then upload it to a server, either run on your own hardware (FOSS, self-hosted) or the Dawarich Cloud: https://dawarich.app I'm using it on a daily basis.

Arc and its free Arc Mini companion. iOS. Been using it since Facebook eclipsed the Moves app. A decade later, it's still not as good as Moves.

I heard about Dawarich, open source, but didn't have time to try it or check the details... https://dawarich.app/

Strava? :-) Half-joking, half-serious; I haven't used Strava in years, I don't remember all its capabilities. Edit: Missed the "locally" part. Sorry, no suggestions. Maybe Garmin has something?

Nope, Garmin only tracks your location when you record an activity that uses GPS, which is good, frankly.

Add Google Podcasts to the list. I switched to AntennaPod. YouTube Music has too noisy an interface.

> Google Chromecast: Wait, they killed Chromecast? I did not know that until I started writing this..

They have something called Google TV Streamer now, so for me it's more of a rebrand than really killing a product.

Except Google TV isn't the same. You can cast to it, but it's more akin to a Roku - it comes with a remote and has "channels" you install. Oh, and a metric crapton of ads it shows you.

I'm still using
- free g suite
- play music
- finance
- nfc wallet is just google wallet isn’t it?
- chromecast, video and audio-only
I guess play music is now YouTube music, and doesn't have uploads, so that can be considered dead, but the others seem alive to me. > Google Hangouts: Which particular thing called Hangouts? There were at least two, frankly I’d say more like four. Google and Microsoft are both terrible about reusing names for different things in confusing ways. > Can't keep track of all the Google chat apps. And Hangouts was part of that problem. Remember Google Talk/Chat? That was where things began, and in my family we never wanted Hangouts, Talk/Chat was better. Allo, Chat, Duo, Hangouts, Meet, Messenger, Talk, Voice… I’ve probably forgotten at least two more names, knowing Google. Most of these products have substantial overlap with most of the rest. I used Picasa and loved it, until I realized I want all my photos available from all my devices at all times and so gave in to Google Photos (for access, not backup) I use SyncThing for that purpose. It syncs across my phone, my laptops, and my Synologies. But I don't sync all my photos. I don't like the thought of providing Google thousands of personal photos for their AI training. Which will eventually leak to gov't agencies, fraudsters, and criminals. Google Desktop Search (and also the Search Appliance if you were an SMB). I think Chromecast has been replaced by Google TV which is a souped up Chromecast. Picasa definitely went against the grain of Google, which is all about tying you to online services. Hangouts had trouble scaling to many participants. Google Meet is fine, and better than e.g. MS Teams. Legacy suite, free forever? Did they also promise a pony?.. Play Music: music is a legal minefield. Don't trust anybody commercial who suggests you upload music you did not write yourself. Finance: IDK, I still get notifications about the stocks I'm interested in. NFC Wallet: alive and kicking, I use it literally every day to pay for subway. Can't say anything about Chromecast. I have a handful of ancient Chromecasts that work. 
I don't want any updates for them.

Why did you keep on using so many Google products if those products kept getting cancelled? Why didn't you quit Google after, say, the third product you used got canned?

I used Google Talk, then Hangouts, but once they switched to Meet, I gave up on them. By then my family was all using Hangouts, and we never settled on a new service, because one of my siblings didn't want to support any chat services that don't freely give user information to the government, and the rest of us didn't want to use a chat platform that does freely give user information to the government.

Am I the only one salty about Google Podcasts? For me that was the straw that broke the camel's back… I dropped Android, switched to iOS, and am slowly phasing out the Google products in my life.

Isn't it "Google TV Streamer" now? From what I can tell (since I am just finding out about this today), they stopped manufacturing the old Chromecast hardware, and at some point will stop supporting the old devices. The old devices may stop working in the future, for example, because they sunset the servers. Like their thermostats. Who knows? I wish there was some law that requires open-sourcing firmware and flashing tools if a company decides to EOL a product ...

Optane persistent memory had a fascinating value proposition: stop converting data structures for database storage and just persist the data directly. No more booting or application launch or data load: just pick up where you left off. Died because it was too expensive, but probably long after it should have. VMs persist memory snapshots (as do Apple's containers, for macOS at least), so there's still room for something like that workflow.

+1 for 3D XPoint. The technology took decades to mature, but the business people didn't have the patience to let the world catch up to this revolutionary technology.

The world had already caught up.
By the time it was released, flash memory was already nearing its speed and latency, to the point that the difference wasn't worth the cost.

> flash memory was already nearing its speed and latency

Kinda, but for small writes it's still nowhere near.

Samsung 990 Pro - IOPS 4KQD1 113 MBytes/Sec
P4800X Optane - IOPS 4KQD1 206 MBytes/Sec

And that's a device 5 years newer and on a faster PCIe generation. It disappeared because the market that values the above attribute is too small, and it's hard to market because at first glance they look about the same on a lot of metrics, as you say.

Systems are stuck in old ways in how they model storage, so they weren't ready for something that is neither really RAM nor disk.
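For scale, the 4KQD1 throughput figures quoted above convert to operations per second like this (a sketch; it treats "MBytes" as MiB and blocks as 4 KiB, so decimal-unit figures would shift by a few percent):

```python
def qd1_iops(mib_per_sec, block_kib=4):
    """Convert a 4K queue-depth-1 throughput figure (MiB/s) to IOPS."""
    return int(mib_per_sec * 1024 / block_kib)  # MiB/s -> KiB/s -> ops/s

ssd = qd1_iops(113)     # Samsung 990 Pro figure quoted above -> 28928
optane = qd1_iops(206)  # P4800X figure quoted above -> 52736
```

Roughly 29k vs 53k small operations per second: a ~1.8x gap at queue depth 1, which is exactly the small-write niche the comment says was too small (and too invisible in headline specs) to market.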
Optane did inspire quite a few research projects for a while, though. A few applications emerged in the server space, in particular.

Optane was impressive from a tech standpoint. We were about to get rid of the split between RAM and disk and use a single stick for both!

I have an Optane drive with the kernel on it, instant boot!

How does that work? It loads the kernel from drive to RAM? Isn't Windows fast boot something like that (only slower, depending on the SSD)?

Windows fast boot semi-hibernates, storing the kernel part of memory on disk for faster startup. This one would have behaved more like suspend to RAM. In suspend to RAM, the RAM is kept powered while everything else is shut down. Recovery is near instant, since all the execution contexts are preserved in RAM. Optane was nearly as fast as RAM, but also persistent like a storage device. So you get suspend to RAM without the requirement to keep it powered like RAM.

Not only because of price. The 'ecosystem' infrastructure wasn't there, or at least not spread wide enough. Neither was the 'mindshare' - the thinking about how to use it. This is more aligned with (live) 'image-based' working environments like early Lisp and Smalltalk systems. Look at where they are now...

A few more thoughts about that, since I happen to have some of the last systems that actually had system-level support for that in their firmware, and early low-capacity Optanes designed for that sort of use. It's fascinating to play with these, but they are low capacity and bound to obsolete operating systems. Given enough RAM, you can emulate that with working suspend and resume to/from RAM. Another avenue is the ever-faster and larger SSDs; in practice, with some models it makes almost no difference anymore, since random access times are so fast and transfer speeds insane. Maybe total and/or daily TBW remains a concern. Both of these can be combined.
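Short of persistent-memory hardware, the "just persist the data structure, skip the load/parse/save cycle" workflow described in this thread can be approximated with a memory-mapped file. A minimal sketch (the file name and layout are invented for illustration; it stores fixed-width integers rather than objects, since raw in-memory pointers would not survive remapping):

```python
import mmap
import os
import struct

def open_counters(path, n=16):
    """Map a file of n little-endian 64-bit counters straight into memory."""
    size = n * 8
    fd = os.open(path, os.O_RDWR | os.O_CREAT, 0o600)
    os.ftruncate(fd, size)     # make the file back the whole mapping
    mem = mmap.mmap(fd, size)  # writes land in the page cache, then disk
    os.close(fd)               # the mapping keeps the file open
    return mem

def bump(mem, slot):
    """Increment counter `slot` in place - no serialize/deserialize step."""
    offset = slot * 8
    (value,) = struct.unpack_from("<q", mem, offset)
    struct.pack_into("<q", mem, offset, value + 1)
    return value + 1
```

Reopen the same file later and the counters are where you left them - the "pick up where you left off" behavior, minus Optane's DRAM-like latency and with ordinary page-cache flushing semantics in between.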
They burned through $5B of 1999 dollars, building out a network in 23 cities, and had effectively zero customers. Finally shut down in 2001. All their marketing was focused on "mobile professionals", whoever those were, while ignoring home users who were clamoring for faster internet where other ISPs dragged their feet. Today, 5G femtocells have replicated some of the concept (radically small cell radius to increase geographic frequency reuse), but without the redundancy -- a femtocell that loses its uplink is dead in the water, not serving as a relay node. A Ricochet E-radio that lost its uplink (but still had power) would simply adjust its routing table and continue operating. Ricochet was super cool. Way ahead of its time. There's even a Joel blog post about it: https://www.joelonsoftware.com/2000/12/20/the-ricochet-wirel... I loved my Ricochet modems so damn much. Sitting in a coffeeshop in Palo Alto with an Apple Powerbook and a second generation Ricochet modem rocking web browsing and ssh sessions at 56k when wifi was unknown to the general public. I still have a couple in a box somewhere and I am tempted to see if I can get them into star mode. I can totally see clabretro or cathode ray dude doing a video with you on this I had a Ricochet modem in '98-99 living in San Francisco. Just 10 years later the iPhone was launched, on 3G networks that had integer multiples better performance. How would I have been better off had Ricochet survived? This seems like a place where technological progress went --- extremely --- in the right direction. Wow, I forgot about this. It was surprisingly great for the time. Apparently I was one of their 4 customers, too! Midori, Microsoft's capability-based security OS[1]. Rumor has it that it was getting to the point where it was able to run Windows code, so it was killed through internal politics, but who knows! It was the Fuchsia of its time... 
[1] https://en.wikipedia.org/wiki/Midori_%28operating_system%29

Midori was fascinating. Joe Duffy's writing on it is the most comprehensive I've seen: https://joeduffyblog.com/2015/11/03/blogging-about-midori/ I've heard someone at Microsoft describe it as a moonshot but also a retention project; IIRC it had a hundred-plus engineers on it at one time, including a lot of very senior people. Apparently a bunch of research from Midori made it into .NET, so it wasn't all lost, but still...

> retention project

Never heard this phrase before, but I can definitely see this happening at companies of that size.

Where did you hear it could run Windows code? Everything known about Midori publicly says the opposite: it was specifically designed at every point to be totally incompatible with all existing code. Maybe a few people on the Midori team fantasized about a migration path, but it was never going to happen. Midori was designed from the start without migration in mind.

The technical foundation seems interesting, but knowing Microsoft, this would have just become yet another bloated mess with its own new set of problems. And by now it would have equally become filled with spyware and AI "features" users don't want.

Have you come across Genode (https://genode.org)? It's kind of in that space, and is still actively developed.

Yahoo Pipes. It was so great at creating RSS feeds and custom workflows. There are replacements now like Zapier and n8n, but I loved that. Also Google Reader, which has been mentioned multiple times already.

Definitely recommend reading https://retool.com/pipes about the history of Pipes, with lots of input from the people who worked on it. (It's not super obvious, especially on mobile, but once you see the site, just scroll down to see the content.)

Yahoo Pipes was what the internet should have been. We're so many decades into computing, and that kind of inter-tool linking has only barely been matched by Unix pipes.
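The canonical Pipes trick - merging several RSS feeds into one, newest first - is small enough to sketch with the Python stdlib (the feed data here is invented; real Pipes added fetching, filtering, and the visual editor on top):

```python
import xml.etree.ElementTree as ET
from email.utils import parsedate_to_datetime

def merge_feeds(feed_xml_strings, limit=20):
    """Combine the <item>s of several RSS 2.0 documents, newest first."""
    items = []
    for xml_text in feed_xml_strings:
        channel = ET.fromstring(xml_text).find("channel")
        for item in channel.findall("item"):
            published = parsedate_to_datetime(item.findtext("pubDate"))
            items.append((published, item))
    # Sort on the parsed date only, so equal timestamps don't error out.
    items.sort(key=lambda pair: pair[0], reverse=True)

    rss = ET.Element("rss", version="2.0")
    merged = ET.SubElement(rss, "channel")
    ET.SubElement(merged, "title").text = "Merged feed"
    for _, item in items[:limit]:
        merged.append(item)
    return ET.tostring(rss, encoding="unicode")
```

Everything Pipes layered on top - fetching over HTTP, filters, regex rewrites - composed out of boxes no more complicated than this one, which is what made the visual editor feel so powerful.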
Many companies are working very hard to make that impossible, unfortunately. For example, you can't get posts from public Facebook groups automatically, although that would be a really good source candidate. They used to allow it, but... not anymore.

I never used it, but Yahoo Pipes sounds like it was awesome whenever I hear people talk about it.

I don't know if it was Yahoo Pipes that died, or a mainstream internet based on open protocols and standards.

It died because it was basically a cool hobby tech demo that happened to be on a Yahoo domain. There was never any real tie-in to Yahoo the company.

I loved Pipes. I had RSS feeds from all the sites where I was sharing content collected up and formatted via Pipes into a single RSS feed that was pulled into a PHP blog. Then all those sites I used to post on stopped supporting RSS one by one, and finally Pipes was killed off. For a while I used a Python library called riko that did the same thing as Pipes without the visual editor. I have to thank it for getting me off PHP and into Python.

If anyone with time, money and resources wants to revive the ideas of Yahoo! Pipes then I would suggest using Node-RED[^1] as a good starting point. It has the advantage of being open source, has well-defined and stable APIs, and a solid backend. Plus 10+ years of constant development, with many learnings around how to implement flow-based programming visually. I used the Node-RED frontend to create Browser-Red[^2], which is a Node-RED that executes solely in the browser, no server required. It does not support all Node-RED functionality but gives a good feel for using Node-RED and flow-based programming. The second project for which I am using the Node-RED frontend is Erlang-Red[^3], which is Node-RED with an Erlang backend. Erlang is better suited to flow-based programming than NodeJS, hence this attempt to demonstrate that! Node-RED makes slightly different assumptions than Yahoo!
Pipes - input ports being the biggest: all nodes in Node-RED have either zero or one input wires, nodes in Yahoo! Pipes had multiple input wires. A good knowledge of jQuery is required but that makes it simpler to get into the frontend code - would be my argument ;) I am happy to answer questions related to Node-RED, email in bio. [^1]: https://nodered.org [^2]: https://cdn.flowhub.org Ah, this would get my vote too. I've seen a few attempts since, but I think you needed that era of "throw lots of money at any idea" to get it off the ground again. I missed Yahoo Pipes a lot so I built something similar recently for myself :) I know there are a few alternatives out there, but had to scratch my own itch. I can recommend Apache Camel (https://camel.apache.org) for similar data integration pipelines and even agentic workflows. There are even visual editors for Camel today, which IMHO make it extremely user friendly to build any kind of pipeline quickly. Apache Karavan: https://karavan.space/
Kaoto (Red Hat): https://kaoto.io

Both are usable end-to-end within VS Code.

Prodigy (the online service). I'm not saying I wish it was still alive, but it contained some amazing technology for the time (mid- to late 1980s), much of which is now present in web tech:

- Client software that ran a VM which received "objects" from a central server (complete with versioning, so it would intelligently download new objects when necessary). Versions were available for IBM (DOS), Windows, and Mac. Think of it as an early browser.
- Multiple access points and a large internal network for storing and delivering content nationwide. This was their proprietary CDN.
- Robust programming language (TBOL/PAL) for developing client-side apps which could also interact with the servers. Just like Javascript.
- Vector (NAPLPS) graphics for fast downloading (remember, Prodigy started in the days when modems maxed out at 1200 baud); later they added JPG support.
- Vast array of online services: shopping, banking, nationwide news, BBSes, mail (before Internet email was popular), even airline reservations.

All this was run by a partnership between IBM, Sears, and CBS (the latter dropped out early). They were the Google of the time.

Vine. It was already pretty big back in 2013
but
Twitter had no idea what to do with it. TikTok actually launched just a few months before Vine was shut down and erased from the internet. Whoever made the decision to kill Vine was an absolute moron, even without hindsight. It was square videos; how hard could it have been to shove an ad banner above it and call it a day? Incredible.

They also killed Periscope right as the explosion of streaming online video happened... Twitter has always been pretty incompetent.

There's an excellent write-up at
https://www.washingtonpost.com/technology/2023/09/28/extreme...

That is so fascinating. They completely ignored their most valuable users, and thus the users left and the site collapsed.

Fascinating, the hubris of the leadership at Twitter to think they knew better than their users.

I will never forgive Twitter for this catch-and-kill of a platform so full of life.

Perhaps because they already had Periscope, which no one used. It was a "buy competitor to kill it" play that didn't have the desired effect.

Amusingly, Periscope was their clone of Meerkat, which was briefly popular before they killed it.

Periscope was in closed beta when Meerkat launched. Neither was a clone of the other. Just two teams with the same idea at the same time.

I've thought about this too. Imagine all the drama the US government could've avoided if Vine had won over TikTok!

With Elon running it? He probably would have actively sold it to China.

In a world where Vine is as successful as TikTok ended up being, who's to say they get to a point where selling to Musk even happens?

Guys, when you invent fictional alternate realities, you're allowed to leave people out of them completely. Anyone you like.

Google Stadia. They had built a solid streaming platform for low-latency cloud gaming but failed hard on actually having interesting games to play on it. You just can't launch a gaming platform with a handful of games that have been available everywhere and expect it to succeed.

Heroku? I know it's still around, though IDK who uses it, but I miss those days when it was thriving. One language, one deployment platform, one database, a couple plugins to choose from, everything simple and straightforward, no decision fatigue. I often wonder, if AI had come 15 years earlier, would it have been a ton better because there weren't a billion different ways to do things?
Would we have ever bothered to come up with all the different tech if AI was just chugging through features efficiently, with consistent training data, etc.?

As soon as they put a persistent Salesforce brand banner across the top, which did nothing but waste space and put that ugly logo in our face every day, my team started our transition off Heroku pretty much right away.

> One language, one deployment platform, one database, a couple plugins to choose from, everything simple and straightforward, no decision fatigue.

Sounds not that different from containers, if you just choose the most popular tooling.

Small projects: docker compose, postgres, redis, nginx
Big projects: kubernetes, postgres, redis, nginx

This is why Heroku lost popularity.

Was going to say, I still use Heroku, and it's been working OK, but I'm getting increasingly creepy vibes from it and fear that it could be abandoned. Starting, of course, with the Salesforce acquisition.

My company still uses Heroku in production, actually. Every time I see the Salesforce logo show up I wince, but we haven't had any issues at all. It continues to make deployment very easy.

I talked to some Heroku reps at a local tech conference a year or so ago; it was clear that they were instructed not to have any personal opinions on the shredding of the free tier, but they did admit in a roundabout way that it lost them a lot of customers - some they were glad to get rid of, as they were gaming the goodwill and costing Heroku lots of money, but they weren't sure if it was a good long-term idea or not.

> One language, one deployment platform, one database, a couple plugins to choose from, everything simple and straightforward, no decision fatigue.

I feel like this also describes something like Vercel. Having never personally used Heroku, is Vercel all that different, except Ruby vs JS as the chosen language?

Didn't they offer free compute?
IIRC all free compute on the Internet went away with the advent of cryptocurrencies, as it became practical to abuse the compute and translate it directly into money.

I use the core product for my SaaS apps. Great platform, does what it needs to do. Haven't felt the need to switch. Sometimes tempted to move to a single VPS with Coolify or Dokku, but not interested in taking on the server admin burden.

What are the reasons that make you want to migrate away? Cost, flexibility, support..?

I think their main failure points were the following:

- Not lowering prices as time went on. They probably kept a super-huge profit margin, but they're largely irrelevant today.
- Not building their own datacenters and staying on AWS. That would have allowed them to lower prices and gain even more market share. Everyone who has been at Amazon/AWS has likely seen the internal market rate for EC2 instances and knows there's a HUGE profit margin derived from building datacenters. Add the recent incredible improvements to compute density (you can easily get 256c/512t and literally terabytes of memory in a 2U box) and you get basically an infinite money glitch.

Pascal/Delphi - especially in the educational context. Crazy fast compiler, so it doesn't frustrate trial-and-erroring students; a decent type system without the wildness of, say, Rust; and all the basic programming building blocks you want students to grasp are present without language-specific funkiness.

Delphi isn't dead - version 13 was recently released: https://www.embarcadero.com/products/delphi. It's even cross-platform, uses Skia as its graphics engine; it's all very nice.

Check out Lazarus (https://www.lazarus-ide.org/), an open-source spiritual successor to Delphi's development environment.

Apparently Python is now the language of choice for teaching programming, and I'm a bit worried about it because the type system is a mess. I think Pascal or Ada are better languages to start learning about types with a good base.
IIRC Delphi didn't have threads, sockets, or OS integration (signals, file watching …). So it wasn't suited to systems programming, i.e. servers and services. It nailed GUI applications, and that was a lot. Maybe Free Pascal has threads and sockets, but IMO it was too late. Delphi 2, the first 32-bit version of Delphi, had all of this. Some, like threads, even had wrappers (TThread), but Delphi came with Win32 bindings out of the box so all Win32 functions were available too - and it came bundled with documentation for the APIs. In addition, calling out to a DLL was trivial, so even if a function wasn't available, you could just define it. Pretty much anything you could do with a C compiler was possible with Delphi 2 too. Free Pascal obviously has all of that stuff too. Not sure about earlier versions, but Delphi 5 (~1999) definitely had all those. Plausible that they were added much later than in the C/C++ world, though. Maybe not in the earliest versions, but by the late 90s, when I learned it, it certainly had those things. I have written a number of services in Delphi, some 20 years ago; they all work fine. Eh, sounds like that wouldn't be a problem for education purposes, as the parent suggests? You need to be doing something really specific to leverage threads/file watching. And people probably use C to teach threads anyway. Of course, being a good teaching language probably doesn't make a language popular or even help it survive. Python is so widely used not necessarily because it's simple to learn but because of its ecosystem. Positron - Firefox version of Electron. "Electron-compatible runtime on top of Gecko" https://github.com/mozilla/positron This would have changed so much. Desktop apps powered by the engine of Firefox, not Chrome. Why? Not enough company buy-in, not enough devs worked on it. Maybe developed before a major Firefox re-write? You may be happy to hear that the new Fedora installer is using Firefox under the hood.
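A rough sketch of that Firefox-as-app-shell trick, in Python (hypothetical code, not the actual anaconda-webui implementation): build a throwaway profile whose userChrome.css collapses the browser UI, then point Firefox at it so only the page content shows. The `#TabsToolbar`/`#nav-bar` element IDs and the `toolkit.legacyUserProfileCustomizations.stylesheets` pref are real Firefox features; the function name and everything else is illustrative.

```python
import pathlib
import tempfile

def make_kiosk_profile() -> pathlib.Path:
    """Create a throwaway Firefox profile whose userChrome.css hides
    the browser UI, leaving only the page content visible."""
    profile = pathlib.Path(tempfile.mkdtemp(prefix="ffshell-"))
    chrome = profile / "chrome"
    chrome.mkdir()
    # Collapse the tab bar and the navigation toolbar.
    (chrome / "userChrome.css").write_text(
        "#TabsToolbar, #nav-bar { visibility: collapse !important; }\n"
    )
    # userChrome.css is ignored unless this pref is enabled.
    (profile / "user.js").write_text(
        'user_pref("toolkit.legacyUserProfileCustomizations.stylesheets", true);\n'
    )
    return profile

# To launch (requires Firefox installed):
#   subprocess.run(["firefox", "--profile", str(make_kiosk_profile()),
#                   "--new-instance", "https://example.org"])
```

Because the profile directory is freshly created each time, the window comes up with no history, extensions, or visible chrome, which is what makes it feel like Electron.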
Ephemeral profile dir on startup, plus custom userChrome.css to hide most of Firefox's UI, and I couldn't tell a difference between it and Electron. https://github.com/rhinstaller/anaconda-webui I wish Red Hat made an easy-to-use framework out of it. With more of Firefox's rendering migrating to Rust, there's got to be a market for a memory-safe alternative to Electron now. I referred to secure rendering. Tauri apps take advantage of the web view already available on every user's system. A Tauri app only contains the code and assets specific to that app and doesn't need to bundle a browser engine with every app. Rendering will still use Edge/Chromium on a generic Windows machine. Looking at Firefox memory usage, I'm afraid the issue there is not memory safety but rather the average JavaScript developer being completely and blissfully unaware of and careless about the memory usage of the software they write. Quartz Composer - Apple's "patch-based" visual programming environment. Drag out a bunch of nodes, wire them together, build a neat little GUI. 10+ years ago I'd regularly build all sorts of little utilities with it. It was surprisingly easy to use it to tap into things that are otherwise a lot more work. For instance I used it to monitor the data coming from a USB device. Like 3 nodes and 3 patches to make all of that work. Working little GUI app in seconds. Apple hasn't touched it since 2016; I kind of hope it makes a comeback, given that Blender and, even more so, Unreal Engine are giving people a taste of the node-based visual programming life. You can still download it from Apple, and it still technically works, but a lot of the most powerful nodes are broken in the newer OSes. I'd love to see the whole thing revitalized. I loved Quartz Composer. It made it really easy to build all sorts of motion graphics. I'd see it used a lot at gigs to create audio-driven visuals. There was even a pretty cool VJ app built on it.
I've tried things like TouchDesigner and Max/MSP but they're too heavy to just pick up and play with. QC was the right balance between simplicity and power. > Quartz Composer Have you looked at https://vvvv.org/ ? Maybe it's still comparatively too heavy, but IMHO it's not that heavy (cf. TouchDesigner and the like). I want to play with it some more myself... Sandstorm: it seemed quite nice with a lot of possibilities when it launched in 2014, but it didn't really take off and then it moved to sandstorm.org. The creator, kentonv (on HN), commented about it recently here https://news.ycombinator.com/item?id=44848099 The actual problem with Sandstorm wasn't the era in which it was released. It would probably have the same problems even if released today. The problem was its application isolation mechanism - especially the data isolation (I think they were called grains). The mechanism is technically brilliant. But it's a big departure from how apps are developed today. It means that you have to make non-trivial modifications to web applications before they can run on the platform. The platform is better for applications designed to run on it from the start. It should have been marketed as a platform for building web applications, rather than as one for just deploying them. Agreed. The best apps turned out to be the ones written for the platform. And many of those took people an afternoon to write, since the platform handled so much for you. Porting "normal" apps into Sandstorm felt like it defeated the purpose. If I did it again I wouldn't focus on portability of existing apps. Especially today, given you could probably vibe code most things (and trust the sandbox to protect you from AI-slop security bugs). Sandstorm was a great idea, but in my opinion it was targeted wrong. It should have been a platform and marketplace for B2B SaaS, not B2C SaaS.
Specifically, all the third-party services which typical web apps use could have been Sandstorm apps, like analytics, logging, email, customer service, etc. ReactOS, the effort to create a free and open-source Windows NT reimplementation. It has been in existence in some form or another for nearly 30 years, but it did not gain the traction it needed, and as of writing it's still not in a usable state on real hardware. It's not abandoned, but progress on it is moving so slowly that I doubt we'll ever see it released in a state that's useful for real users. It's too bad, because a drop-in Windows replacement would be nice for all the people losing Windows 10 support right now. On the other hand, I think people underestimate the difficulty involved in the project and compare it unfavorably to Linux, BSD, etc.
Unix and its source code were pretty well publicly documented and understood for decades before those projects started; nothing like that ever really existed for Windows. They had no chance. Look how long it took Wine to get where it is. Their project is Wine + a kernel + device driver compatibility, and a moving target. ReactOS right now focuses on Windows XP-era hardware and compatibility, and it's also not guaranteed to work outside a VM. > ReactOS, the effort to create a free and open source Windows NT reimplementation. Some projects creep along slowly until something triggers an interest and suddenly they leap ahead. MAME's Tandy 2000 implementation was unusable, until someone found a copy of Windows 1.0 for the Tandy 2000; then the emulation caught up until Windows ran. Maybe ReactOS will get a big influx of activity after Windows 10 support goes offline in a couple of days, or even shortly after, when you can't turn AI spying off, not even three times a year. Not so long ago there was a leak of Windows' source code, up to XP and Server 2003… the leak was so complete there are videos on YouTube of people building and booting (!!!) Windows from it. And yet, no big leap in ReactOS (at least for now). IIRC ReactOS forbids you from contributing if you had access to the Windows source code in some way, shape, or form. They need to train an LLM on the Windows source code and ask it to write a Windows clone. Apparently copyright law only applies to humans; generative AI gets away with stealing because there is too much monetary interest involved in looking the other way. Wow, so you're saying "Windows but with even less reliability and more security problems plus tech debt"? I don't think the world really needs that. :) Leaks like this actually slow down ReactOS development. The project is supposed to be a clean-room reverse engineering effort. If you even see Windows code, you are compromised, and should not work on ReactOS.
Wine, Proton, and virtualization all got good enough that there's no need for a half-baked binary-compatible Windows reimplementation, and I think that took a lot of the oxygen out of what could have been energy towards ReactOS. It's a cool concept but not really a thing anybody requires. The easiest way to avoid patent liabilities is to always be 20 years behind. 20 years behind gets us back to Windows XP, which had a better experience than Windows 11 anyway. I've heard people say this, and believed it myself for a long time, but recently I set up a Windows XP VM and was shocked by how bad the quality of life was. I think nostalgia is influencing this opinion quite a bit, and we don't realize the mountain of tiny usability improvements that have been made since XP. > I think people underestimate the difficulty involved in the project I don't think people do; it sounds like a nearly impossible struggle, and at the end you get a Windows clone. I can't imagine hating yourself enough to work on it for an extended period of time for no money while putting yourself and your hard work at legal risk. It's a miracle we have Wine and serious luck that we have Proton. People losing Windows 10 support need to move on. There's Linux if you want to be free, and Apple if you still prefer to be guided. You might lose some of your video games. You can still move to Windows 11 if you think that people should serve their operating systems rather than vice versa. "putting yourself and your hard work in legal risk" Like what? I'm genuinely curious what personal risks anyone faces from contributing to ReactOS. I am also curious what kind of legal risk may threaten the work? I mean, even in the unlikely scenario that something gets proven illegal and is ordered to be removed from the project, what would prevent any such expunged part from being re-implemented by some paid contractor (now under legally indisputable circumstances), thus rendering the initial legal action moot?
The information superhighway The internet before advertising, artificial intelligence, social media, and bots. When folks created startups in their bedrooms or garages. The days when Google's slogan was "don't be evil". That part of the Internet still exists; it's just that nobody visits those sites anymore. Communities are moving back to early-Internet-like chatrooms like IRC, but now it's Slack, Discord, and the like. Everything private. I really miss the push, around 8 years ago, where a lot of major projects were moving to IRC. It's too bad Freenode took the opportunity to jump the shark and killed the momentum. I mean, they're intentionally buried in the name of capital. If you need more than a Google search to find them, of course no one will go to them. I don't like the comparison of siloing our information into Discord to the old internet. We had indexable information in forums that is "lost", not in the literal sense, but because you wouldn't be able to find it without obsessive digging. Conversations in Discord communities are very surface-level and cyclical because it's far less straightforward to keep track of and link to answers from last week, let alone two years ago. It is profoundly sad, to be honest. I guess my abandoned/dead project might be Usenet. Sure, there were very dark places, and a lot of it was just a way to distribute porn, but that pretty much describes the Web. Usenet was like Reddit not controlled by a single company; like the Fediverse with infinite channels; like all of the world's threaded web fora displayed in exactly the way you want. We had that in the 1990s, and we're slowly groping toward getting it back. AKA "back when Marc Andreessen had hair and not enough money to build an apocalypse bunker on a personal island." And when no one knew you were a dog and neither did they care. Animated GIFs of cats, banner ads, and pixels that cost one dollar until a million were sold. And it all ran on Chuck Norris' personal computer.
Kuro5hin https://wikipedia.org/wiki/Kuro5hin I was a holdout on smartphones for a while, and I used to print out K5 articles to read while AFK... Just such an amazing collection of people sharing ideas and communal moderation, editing, and upvoting. I learned about so many weird and wonderful things from that site. The candle that burned twice as bright was really adequacy.org, which sourced its trolls from Kuro5hin. (Archive here: https://www.inadequacy.org/) Rusty Foster (creator of Kuro5hin) is still writing!
https://www.todayintabs.com/ Are there any spiritual successors? Similar to jlokier's response, I ended up here after K5 went away. HN fills my geek interests pretty well, and over the last few years I've found that "long-form video essays" on YT and audiobooks/podcasts fill my desire for learning about other random topics. Microsoft Silverlight. Full C# instead of god-forbidden JS. Full vector DPI-aware UI, with grids, complex animation, and all the other stuff that HTML5/CSS didn't have in 2018 but Silverlight had even in 2010 (probably even earlier). MVVM pattern, two-way bindings. Expression Blend (basically Figma) that allowed designers to create UI that was XAML, had sample data, and could be used by devs as-is with maybe some cleanup. Excellent tooling, static analysis, debugging, what have you. Rendered and worked completely the same in any browser (Safari, IE, Chrome, Opera, Firefox) on Mac and Windows. If that thing still worked, boy would we be in a better place regarding web apps. Unfortunately, the iPhone killed Adobe Flash, and Silverlight as an aftermath. Too slow a processor, too much energy consumption. I am happy this one died. It was just another attempt by Microsoft to sidestep open web standards in favor of a proprietary platform. The other notorious example is Flash, and both should be considered malware. Open web standards are great, but consider where we could have been if competition drove them a different way? We're still stuck with JavaScript today (wasm still needs it). Layout/styling is caught up now, but where would we be if that came sooner? > Open web standards are great but consider where we could have been if competition drove them a different way? We're still stuck with JavaScript today (wasm still needs it). Layout/styling is caught up now but where would we be if that came sooner? Why do you think JavaScript is a problem? And a big enough problem to risk destroying open web standards.
It's not that it's a problem; I just don't think it's the best place to be. It was not designed to be used like this. Yes, it's better now, but it's still not great - you still ship JS as text blobs that need to be parsed and compiled by every browser. I don't see how alternatives to JavaScript are a risk to open web standards. WebAssembly is itself a part of those same standards. It's just a shame that it was built as an extension of JavaScript instead of being an actual alternative. The same reason TypeScript exists. > The same reason Typescript exists TypeScript exists for the same reason things like mypy exist, and no one in their right mind claims that Python's openness should be threatened just because static typing is convenient. Though in principle they serve similar purposes, there are some big differences. Python with types is still just Python. TypeScript is a different language from JS (I guess it's a superset?), and it being controlled by a large company could be considered problematic. I suppose JS could go in the same direction and adopt the typing syntax from TS as a non-runtime thing. Then the TypeScript compiler would become something like mypy, an entirely optional part of the ecosystem. Flash & Silverlight were both ahead of the open web standards of their time. They also didn't suffer as much from the browser wars. Flash's ActionScript helped influence changes to modern JS that we all enjoy. You sometimes need alternative ideas to promote & improve ideas for open web standards. What web standards? :-) Stuff like AngularJS was basically created for the same reason Flash/Silverlight went down - the iPhone. Did Silverlight have the same security issues as Flash? Yes, even using C# couldn't save them. > A remote code execution vulnerability exists when Microsoft Silverlight decodes strings using a malicious decoder that can return negative offsets that cause Silverlight to replace unsafe object headers with contents provided by an attacker.
In a web-browsing scenario, an attacker who successfully exploited this vulnerability could obtain the same permissions as the currently logged-on user. If a user is logged on with administrative user rights, an attacker could take complete control of the affected system. An attacker could then install programs; view, change, or delete data; or create new accounts with full user rights. Users whose accounts are configured to have fewer user rights on the system could be less impacted than users who operate with administrative user rights. https://learn.microsoft.com/en-us/security-updates/securityb... Probably didn’t have the level of adoption needed for the nefarious types to justify spending time finding Silverlight exploits. I loved silverlight. Before I got a “serious” job, I was a summer intern at a small civil engineering consultancy that had gradually moved into developing custom software that it sold mostly to local town/city/county governments in Arizona (mostly custom mapping applications; for example, imagine Google Maps but you can see an overlay of all the street signs your city owns and click on one to insert a note into some database that a worker needs to go repair it… stuff like that). Lots of their stuff was delivered as Silverlight apps. It turns out that getting office workers to install a blessed plugin from Microsoft and navigate to a web page is much easier than distributing binaries that you have to install and keep up to date. And developing for it was pure pleasure; you got to use C# and Visual Studio, and a GUI interface builder, rather than the Byzantine HTML/JS/CSS ecosystem. I get why it never took off, but in this niche of small-time custom software it was really way nicer than anything else that existed at the time. Web distribution combined with classic desktop GUI development. Sounds like a nice gig. 
> It turns out that getting office workers to install a blessed plugin from Microsoft and navigate to a web page is much easier than distributing binaries that you have to install and keep up to date. And developing for it was pure pleasure; you got to use C# and Visual Studio, and a GUI interface builder IIRC around that time, you could also distribute full-blown desktop applications (C# WinForms) in a special way via the browser, by which they were easily installable and self-updating. The tech was called ClickOnce https://learn.microsoft.com/en-us/visualstudio/deployment/cl.... I think the flow was possibly IE-only, but that was not a big issue in a business context at that time. Back in the day Microsoft sent someone to our university to demo all of their new and upcoming products. I remember Vista (then named Longhorn) and Silverlight being among them. I also remember people being particularly impressed by the demo they gave of the latter, but everything swiftly falling apart when someone queried whether it worked in other browsers. This was at a time when IE was being increasingly challenged by browsers embracing open standards. So there was an element of quiet amusement/frustration in seeing them continue to not get it. I sure liked Aldus Freehand a lot more than Adobe Illustrator although it has been so long that I don’t remember specifics other than I generally understood how to use it a lot better than Illustrator. Microsoft Songsmith is another one that deserved a second life. It let you hum or sing a melody and would auto-generate full backing tracks, guitar, bass, drums, chords, in any style you chose. It looked a bit goofy in the promo videos, but under the hood it was doing real-time chord detection and accompaniment generation. Basically a prototype of what AI music tools like Suno, Udio, or Mubert are doing today, fifteen years too early. 
If Microsoft had kept iterating on it with modern ML models, it could've become the "GarageBand for ideas that start as a hum." It also had one of the best campy promotional videos ever produced: https://www.youtube.com/watch?v=k8GIwFkIuP8 I will just leave this here: https://youtu.be/mg0l7f25bhU Cooperative Linux (coLinux) seemed like a cool concept. It let you run the Linux kernel alongside the Windows kernel while allowing both full access to the hardware. Unfortunately it hasn't fully made the jump from 32-bit to 64-bit. Ray Ozzie's Groove, by Groove Networks, embraced and extinguished by MSFT: Ozzie, who had previously worked at IBM, was particularly interested in the challenge of remote collaboration. His vision culminated in the creation of Groove, which was released in 2001. The software distinguished itself from other collaboration tools of the time by allowing users to share files and work on documents in real time - even without a continuous internet connection. Groove's architecture was innovative in that it utilized a peer-to-peer networking model, enabling users to interact directly with each other and share information seamlessly. This approach allowed for a level of flexibility and responsiveness that was often missing in traditional client-server models. Asynchronous collaboration was a key feature, where team members could work on projects without needing to be online simultaneously. https://umatechnology.org/what-happened-to-microsoft-groove/ We built some things on it; it was like CRDT for all the things. I liked del.icio.us: it was online bookmark sharing, but with actual people I knew, and it had genuinely useful category tagging. I guess it was basically replaced with https://old.reddit.com and maybe Twitter. I was always confused by del.icio.us, and bookmark sharing in general. In my head, bookmarks and sharing are distinct things. Bookmarks are things I want to save to visit again, or shortcuts to sites I visit often.
Sharing is for something I find interesting and think others will too, but that I probably won't ever visit again. I will bookmark the site to pay my utility bill, but it's not something I'd ever share. I might share a link to a funny YouTube video, but wouldn't bookmark it. I think social bookmarking didn't really know what it was, which is why the modern versions are more about sharing links than bookmarking. I don't post my bookmarks to Reddit, where people follow me as a person. I would post links I think are worth sharing to a topic people are interested in following. Self-hosted Linkding is a pretty great modern equivalent https://github.com/sissbruecker/linkding Isn't Pinboard (who bought Delicious) very similar? I also see my friends' bookmarks there; I recently switched to Raindrop though, as it's much better maintained. It is, but people are switching away due to lack of maintenance and the founder's political views. Java Applets. All the buzz in the 2020s about WASM giving websites the ability to run compiled code at native speed, letting pages network with your server via WebRTC? Yeah, you could do that with Java Applets in 1999. If Sun (and later Oracle) had been less bumbling and more visionary - if they hadn't forced you to use canvas instead of integrating Java's display API with the DOM, if they had a properly designed sandbox that wasn't full of security vulnerabilities - Java and the JVM could have co-evolved with JavaScript as a second language of the Web. Now Java applets are well and truly dead; the plugin's been removed from browsers, and even the plugin APIs that allowed it to function have been deprecated and removed (I think; I'm not 100% sure about that). Java eventually got a DOM API but it was too late. https://docs.oracle.com/javase/tutorial/deployment/applet/ma... Maemo/MeeGo. I know Sailfish is still around, but things would have been very different today if Nokia had put all its weight behind it back then. TITCR.
Hit ctrl-f and typed Meego as soon as I saw this thread, hoping I'd be the first. Alas. The N9 was literally a vision from an alternate timeline where a mobile platform from a major manufacturer was somehow hackable, polished, and secure. Favorite phone I've ever owned and I used it until it started to malfunction. Had a Jolla for a bit, too. It was nice to see them try to keep the basic ideas going but unfortunately it was a pain in the ass to use thanks to their decision to go with a radio that only supported GSM/EDGE in the US. Had to carry around a MiFi just to give it acceptable data service. I think the idea with Jolla is that if Nokia ever did an about-face, they were ready to be reabsorbed and get things back on the right track. Unfortunately, though we do once again have a "Nokia", it's just another Android white label with no interest in maintaining its own leading-edge smartphone platform. In my ideal world, Maemo/Meego and Palm's WebOS (not LG's bastardization of it) would be today's Android and iOS. Apple would have inevitably done their own thing, but it would have been really nice to have two widely used, mature and open mobile Linux platforms. I loved my N900, and my N800 before that, and I would have loved to have seen successors. Ultimately, I ended up switching to Android because I was tired of things only available as apps. Since then, web technologies have gotten better, and it's become much more feasible to use almost exclusively websites. > it's become much more feasible to use almost exclusively websites. And that's precisely why companies nerf their web sites and put a little popup that says "<service> works better on the app". Worth remembering it was the Microsoft partnership with Nokia that intentionally killed it. They should have partnered not only with Intel, but with Palm, RIM or whatever other then-giant to rival Android. 
Those two went their own ways, with WebOS and buying QNX, so maybe they could have agreed to form a consortium for an open and interoperable mobile OS. WebOS died at HP, after they bought Palm. I'm genuinely impressed at HP: somehow they always have the future in their hands... and kill it. When I saw the title, my first thought was also MeeGo. While I don't believe it would have been all that great had it not been abandoned, MeeGo absolutely should not have been discarded in such a disgraceful manner. I was gonna say MeeGo. They killed it just as it was getting to a usable state. One of the last chances we had to get a proper third option in the mobile market. Boot2Gecko, or whatever the browser-as-operating-system was called. This was a project that should have focused on providing whatever its current users needed, expanding and evolving to do whatever those users wanted it to do better. Instead it went chasing markets, abandoning existing users as it did so, in favour of potentially larger pools of users elsewhere. In the end it failed to find a niche going forward while leaving a trail of abandoned niches behind it. I adored my Firefox phones. Writing apps was so easy I built myself dozens of little one-offs. Imagine if it had survived to today: its trivial HTML/CSS/JS apps could be vibe coded on-device, making it the ultimate personalized phone. Luckily it wasn't long after Mozilla abandoned it that PWAs were introduced and I could port the apps I cared about. > Imagine if it had survived to today, its trivial html/css/js apps could be vibe coded on-device and be the ultimate personalized phone. That's actually an incredibly cool concept. It lives on as KaiOS. It has limited success as a low-end phone platform now. For a few short months circa 2016 or 2017, KaiOS was the number one mobile OS in India. This was probably because of all the ultra-cheap KaiOS-powered Reliance Jio phones flooding the Indian market at the time.
I noticed the trend when I was working on a major web property for the Aditya Birla conglomerate. My whole team was pleasantly surprised, and we made sure to test everything in Firefox for that project. But everyone switched to Android + Chrome over the next few years, which was a shame. Today, India is 90% Chrome :( The signature function of the German ID card ("neuer Personalausweis"). It's 2025 and we still haven't solved secure online identification, and we are still not using end-to-end encryption for e-mail; most e-mail is not even signed. Interaction with state agencies is still mostly via paper-based mail. The only successfully deployed online offering of the German state administration seems to be the online portal for tax filings, "elster.de". The use of a private key on the national ID card would have been able to provide all this and more using standard protocols. At least for identification, there is an expensive effort to re-design something similar in a smartphone-centric way, with less security and not based on standard approaches, called "EUDI wallets". For encrypted communication the agreed-on standard seems to be "log in to our portal with HTTPS and use our proprietary interfaces to send and receive messages"... Why did it die: Too expensive (~30€/year for the certificate, >100€ one time for the reader) and too complicated to use. Not enough positive PR. Acceptance at state-provided sites was added too late. In modern times, everything must be done with the smartphone; handling of physical cards is considered backwards, hence this is probably not going to come back... Edit: Another similarly advanced technology that also seems to have been replaced by an inferior smartphone substitute: HBCI banking (a standard...) using your actual bank card + a reader device to authenticate transactions... replaced by a proprietary app on a proprietary smartphone OS... I really liked Google Circles, a feature of the Google+ social media network.
It allowed you to target content to specific groups of users. You could have a "family" circle or a "work" circle and not have to worry about cross posting something accidentally. It was a small thing but it made it really easy to manage your posts. MS Sidewinder Force Feedback Pro (1997) and Sidewinder Force Feedback 2 (USB).
You can buy similar today, but nowhere near the price point. Also, the out-of-the-box support by Windows has vanished, and therefore the incentive of game developers to include force feedback. I still have my MS Force Feedback 2, and it still works great! I heard that some patent troll got a hold of the patent for force feedback joysticks, and all manufacturers just gave up on them because of the troll. The patent expired recently IIRC, so hopefully people will start making them again soon. GeoCities: it was "put your HTML here" free web hosting back when people barely knew what HTML was. Today you have to be a rocket scientist to find a way to host a free static "simple" page online. Neocities[0] is going strong, if you just want an alternative. Copy-paste your HTML into the online editor or upload your files, and that's it. GitHub Pages is frankly the closest, in my opinion, as someone who used GeoCities to host a domain for years longer than I probably should have. … or just use Cloudflare Pages and upload a folder or zip of your static site via a web UI? GitHub Pages Valid option - I used it myself for a very brief toe-dip into blogging earlier this year - but maybe worth noting that Google seems to flat-out refuse to crawl anything you put there. Won't pick it up by itself, won't read a sitemap you explicitly tell it about. It'll grudgingly index specific page URLs you tell it about, but that's kind of absurd. I don't know if it's because it's on a subdomain, or a Microsoft property, or because I was 100% ad- and tracker-free, or what. I tried DDG (Bing-backed, I believe) and it happily found everything with no manual intervention at all. That was the point where I ditched Google Search after 30 years. I'll bet you I could ask any LLM about it and have something launched within an hour. Tumblr will practically let you do that, for chrissake. Tumblr is nothing like a webpage.
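None of this helps with the free-hosting part, but the "put your HTML here" half of the GeoCities experience still exists in Python's standard library: previewing a static page is a few lines (a minimal local-preview sketch, not a deployment solution; the `serve` helper is my own naming).

```python
from http.server import HTTPServer, SimpleHTTPRequestHandler

def serve(port: int = 8000) -> HTTPServer:
    # Serve the current working directory, index.html and all,
    # at http://127.0.0.1:<port>. Call .serve_forever() on the result
    # to actually run it (Ctrl-C to stop).
    return HTTPServer(("127.0.0.1", port), SimpleHTTPRequestHandler)

# To run: serve().serve_forever()
```

This is the same idea as running `python -m http.server` from the directory holding your pages; the hard part the thread is lamenting is getting that directory hosted online for free.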
LLMs were just invented 5 minutes ago and are losing money hand over fist until people are dependent, then they will be very expensive to use; and you still have to figure out how to host, where to host, and how much it's going to cost you. So, I have no idea what you're getting at. You could have said Wordpress.com or something. It's not quite a website, but it's close. It's also probably going to be Typepad (i.e. defunct) in a few years, and Blogger is probably going to be there quicker than that.

Ask the LLM about hosting too. I've literally gone through this process recently - setting up hosting, a domain, and a static html site from scratch, vibing from start to finish. It is not difficult.

It is between one and two orders of magnitude harder than Geocities, and infinitely more expensive.

The "Eve" programming language / IDE - https://witheve.com

It was a series of experiments with new approaches to programming. Kind of reminded me of the research that gave us Smalltalk. It would have been interesting to see where they went with it, but they wound down the project.

Why did they not pursue this? Were there any applications using this in the wild? It was not immediately obvious from their github repository.

I worked on this project so I can give some insight. The main reason we didn't keep working on it was that it was VC funded and we didn't have a model for making money in the short term. At the end we were pursuing research related to natural language programming and reinforcement learning in that area (I recently blogged about it here: https://mech-lang.org/post/2025-01-09-programming-chatgpt), and were considering folding our small team into OpenAI or Microsoft or something. But we wanted to work as a team and no one wanted to take us as a team, so we called it.

It didn't get far enough to be "used" in a production sense. There was enough interest and people were playing around with it, but no real traction to speak of.
Frankly, language projects are difficult because these days they have to be bootstrapped to a certain size before there's any appreciable use, and VCs are not patient enough for that kind of timetable. Here's a postmortem Chris gave about all that: https://www.youtube.com/watch?v=WT2CMS0MxJ0 / https://www.youtube.com/watch?v=ThjFFDwOXok

I don't personally know, but I used to use the creator (Granger)'s previous work, the Clojure live-running editor LightTable. LT was cool, but they abandoned it with insufficient hand-off when it was 80-90% done to work on Eve. I know a bunch of people were unhappy that LightTable wasn't finished, especially because they raised money via Kickstarter for it. Maybe Eve was too ambitious. Maybe funding never materialized. Maybe they just got bored and couldn't finish. Maybe they pissed off their audience.

https://eyg.run/ is heavily inspired by eve!

I know about this one as well: https://www.reddit.com/r/ProgrammingLanguages/comments/1ioij... but the author seems to have taken it private for now. I think he's the Gren author, which is a fork of Elm.

As for me, I brought some eve-y ideas to my language project: https://github.com/mech-lang/mech

I certainly know and admire eve.
However I don't think I consciously took that many features from it into EYG. I'd be curious what the crossover is.

The Lockheed D-21 drone. Supersonic ramjet without the complexity of scramjet or the cost of turbojet, hamstrung by the need for a manned launch platform (making operations safety-critical… with predictable results) and recovery to get data off it. Twenty or forty years later it would have been paired with a small number of high-cost launcher UAVs and had its cost driven down to disposable, with data recovery over radio comms… but twenty to forty years later there's nothing like it, and the maturation of satellites means there almost certainly never will be.

It's highly probable that a successor of this is in active use, we just don't know anything about it. :-/

Skype ; Because my R.I.P. grandma was using it to talk to her relatives overseas just like she would use a phone, but it didn't cost an arm and a leg (unlike phone calls). One of the best P2P software at the time. It was so simple and effective and allowed people to call real phones with Skype credit. A genius product ripped apart by Microsoft.

Have you used Microsoft Teams recently? Bad UI, hard to configure external hardware and a good level of incompatibility, missing the good old "Echo / Sound Test Service". At a point I even installed Skype on my old Android, but it was sucking up too much battery.

Not just "of the time" - if you need to call international numbers as of 2025, there's no good replacement for Skype from earlier this year.

I had a similar pleasant experience with Skype. Back in 2009, I was deployed to the Persian Gulf. This was before ubiquitous cell phones (at least, I left my cell phone back in the US). Phone cards worked to call home, but my cheap solution was to use Skype from my handheld PSP using Wi-Fi from a cafe. It worked out great for me at least, and I'll always appreciate that.

First Class and Hotline. Server/Client.
First Class had the broader userbase, such as schools and organizations in the groupware/collaborative segment (but also Mac user groups and so on). First Class was a commercial product (the server). It had filesharing (UL/DL), it had its own desktop, mail, chat, IM, voice mail and more. Started out on Mac, but later became cross platform. You can still find presentations and setup guides on old forgotten university/school websites.

Hotline, on the other hand, was very easy to set up and also pretty lightweight. It had a server tracker. In the beginning it was Mac only. Lots of warez servers, but also different (non-warez) communities. It had filesharing (UL/DL from the file area), chat and a newsboard. The decline came after its developers released the Windows versions. Most servers became clickbait pron/warez with malware etc. People started to move away to the web, and Hotline basically died out. Now, there were some open source/clone projects that kept the spirit alive. But after a few years, web forums, torrents and other p2p apps took over. There are still some servers running in 2025, and open source server/client software is still developed.

Compared to First Class, Hotline was the Wild West. It only took 15 minutes to set up your own server and announce it on a server tracker (or keep it private).

When I use Discord and other apps/services, it's not hard to think of FC/HL. But then, they were solutions of their time. More about:
https://en.wikipedia.org/wiki/FirstClass
https://en.wikipedia.org/wiki/Hotline_Communications
https://www.macintoshrepository.org/6691-hotline-connect-cli...

I ran a Hotline server in my formative teenage years (on a server in my bedroom with a static IP), and we all hung out there. It was absolutely great.

Non DAW. Breaking up each function of the DAW into its own application gave a better experience in each of those functions, especially when you only needed that aspect, since you were not working around everything else that the DAW offers. The integration between the various parts was not all that it could be, but I think the idea has some real potential.

Thought about Non immediately, but I figured it must have (had) about 2 other users amongst HNers, though. :) Nice to see it mentioned. I used it quite a bit to produce radio shows for my country's public broadcasting. Because Non's line-oriented session format was so easy to parse with classic Unix tools, I wrote a bunch of scripts for it with Awk etc. (E.g. calculating the total length of clips highlighted with brown color in the DAW -- which was stuff meant for editing out; or creating a poor man's "ripple editing" feature by moving loosely-placed clips precisely side by side; or, eventually, converting the sessions to Samplitude EDL format, and, from there, to Pro Tools via AATranslator [1] (because our studio was using PT), etc. Really fun times!)

I've never heard of this software before. Any idea why it's discontinued? There are a bunch of weird messages that point to sort of a hostile takeover of the project by forking, but it doesn't say anything about why or how it was discontinued.

From what I remember: it was mostly a one-man project and he was writing it for himself. This upset some people, and they felt his personal project should be democratic. It created a great deal of drama, and he found himself having to deal with the drama every time he tried to engage with the community.
Eventually he just walked away from it all. The fork died shortly after, since the people who forked it were still dependent on him for development; all they really offered was a fork that was free of his supposed tyranny.

Secure Scuttlebutt (the gossip-based social network) died circa 2019 or 2024, depending on who you ask. It died before its time for various reasons, including:

1. competing visions for how the entire system should work
2. dependence on early/experimental npm libraries
3. devs breaking existing features due to "innovation"
4. a lot of interpersonal drama, because it was not just open source but also a social network

the ideas are really good, someone should make the project again and run with it

I tried it twice and the onboarding experience was insurmountable. Never managed to achieve a critical mass of followers or whatever they call it, so things were permanently read-only for me. I'd reply but nobody saw it. It was a fascinating protocol underneath, but the social follow structure seemed to select strongly for folks who already had a following or something.

So much drama there too, but it's designed to attract drama.

Drama has killed the technological progress in open source, if you ask me. Having seen what goes on in the FOSS world and what goes on in the large FAANG-size corporate world, no wonder the corporate world is light-years ahead.

It is a fundamental constraint of consensus-based organizations. You need hierarchy to move faster, but that has other disadvantages.

You don't need hierarchy, but you need some sort of process. "Consensus-based" just means that the loudest and most enduring shouters get their way, and when their way fails spectacularly, they leave in a huff (taking their work with them, badmouthing the project, and likely starting a fork that will pull more people out of the project and confuse potential users who just bail on trying either). Those people need to be pushed out early and often. That's what voting is for.
You need a supermajority to force an end to discussion, and a majority to make a decision. If you hold up the discussion too long with too slim a minority, the majority can fork your faction out of the group. If the end of debate has been forced, and you can't work with the majority, you should leave yourself. None of this letting the bullies get their way until everything is a disaster, then splitting up anyway stuff.

I can recall a distinct time period when we SSB devs were passing around the URL to "The Tyranny of Structurelessness" via local-first encrypted direct messages. The essay helped us understand what was happening, but alas we did not have the tools to stop it happening to us! Hah.

Naive take. I especially love this: “Those people need to be pushed out early and often. That's what voting is for. You need a supermajority to force an end to discussion, and a majority to make a decision.” We know what needs to be done, but it's not being done. There's no consensus. Consensus takes time and effort and has a lot of friction. I am part of a co-op and I have seen first hand how this goes. And it's fine, consensus-based systems have other advantages, but they move slower than hierarchies.

Nah, it is not. The core of the issue is that drama is a way to impose your views of the world. In FOSS software you quite literally don't have to agree. You can fork the software and walk your own path. You can even pull changes from the original codebase; most licenses allow that. Consensus is only necessary if you care about imposing your views of the world onto others.

ICQ ; It was the first instant messenger. The technology could have adopted voice (and not get disrupted by Skype) and mobile (and not get disrupted by WhatsApp) and group chat (and not get disrupted by Slack/Discord). But they didn't even try and put up a fight.

The last time ICQ was mentioned on HN I could still remember my ICQ number.
It's a benchmark for how much my memory has deteriorated in the last five years. I do still remember it fondly, though.

They got bought by AOL in '98, long before most/all of this innovation happened? Edit: in fact I'd say they were irrelevant before pretty much all of those innovations. By the time AIM or MSN Messenger really became popular, ICQ didn't matter anymore.

RethinkDB. Technically it still exists (under The Linux Foundation), but (IMO) the original company's widening scope (the Horizon BaaS) that eventually led to its demise killed its momentum.

Man, I loved the original concept for demos but never built anything real with it. Curious if anyone did?

Macromedia Flash. Its scope and security profile were too big. It gave way to HTML's canvas. But man, the tooling is still nowhere near as good. Movieclips, my beloved. I loved it all.

The iPhone killed Flash, probably because it would've been a way to create apps for it, more probably because it would've been laggy on the 2007 hardware, and people would've considered the iPhone "a piece of junk". Interesting how Flash became the almost universal way to play videos in the browser in the latter half of the 2000s (damn, I'm old...).

It's incredible to me that they killed the whole tool instead of making a JS/Canvas port. Even without "full Flash websites", there's still a need for vector animations on the web.

Adobe Animate (new name for Macromedia/Adobe Flash) can output to JS/Canvas now.

Can it really do interactive things though, like games? The main draw card of Flash was its excellent integration of code and animation.

I agree that the tooling was unbelievable… better for interactive web than anything that exists today AFAIK. I wonder why no one has managed to build something comparable that does work on a phone.

I agree the tooling was great... for making apps/games for desktops with a mouse and keyboard and a landscape screen of at least a certain size.
Maybe they could have fixed all that for touch screens, small portrait screens, and more, but they never did make it responsive AFAIK.

As a Linux user, I hated Flash with a passion. It mostly didn't work despite several Linux implementations. About the time they sorted all the bugs out, it went away. Good riddance.

I for one am so glad Flash died. At one point I dreaded navigating to a new website because of it.

ZeroNet decentralized web platform:

- Based on BitTorrent ideas
- Completely decentralized websites' code and data
- Either completely decentralized or controllable-decentralized authentication
- Could be integrated into existing websites (!)

It's not quite dead, there's a supported fork, but it still feels like a revolution that did not happen. It works really well.

Definitely Opa: http://opalang.org/

In 2011, before TypeScript, Next.js or even React, they had seamless server-client code, in a strongly typed functional language with support for features like JSX-like inline HTML, async/await, string interpolation, a built-in MongoDB ORM, CSS-in-JS, and many syntax features that were added to ECMAScript since then. I find it wild how this project was 90%+ correct on how we would build web apps 14 years later.

Visual Basic 6 - arguably the most accessible way of creating GUI apps.

You still have Lazarus, "a Delphi compatible cross-platform IDE for Rapid Application Development."

Lazarus seems like a fantastic GUI builder, but the problem with it (and VB6) is that I have to use a language with 0.01% the ecosystem of Python.

Lazarus is nice, but both its APIs and the UI feel like they're still stuck in the early 00's.
It's not enough to look like VB6 / Delphi these days; you've got to keep up with what kinds of conventions we expect now.

Gambas is a modern, open source Visual Basic dialect in the style of VB Classic.

It might be too soon to call it abandoned, but I was very intrigued by the Austral [1] language. The spec [2] is worth reading, it has an unusual clarity of thought and originality, and I was hoping that it would find some traction. Unfortunately it seems that the author is no longer actively working on it.

[1] https://austral-lang.org/
[2] https://austral-lang.org/spec/spec.html

I played with Austral about a year ago and really wanted to use it for my projects, but as a hobbyist and mostly inept programmer I found it lacked the community and ecosystem I require. I found it almost intuitive, and the spec does an amazing job of explaining the language. Would love to see it get a foothold.

Same with Vale: https://vale.dev

Ouch, last "recent update" in 2023. Any idea what happened?

The author got hired by Modular, the AI startup founded by the creators of LLVM and Swift, and is now working on the new language Mojo.
He's been bringing a bunch of ideas from Vale to Mojo.

Oh nice! I just had an excuse to try Mojo via MAX inference, it was pretty impressive. Basically on par with vLLM for some small benchmarks, a bit of variance in TTFT and TPOT. Very cool!

Dreamweaver, or some other real WYSIWYG web page editor that could maybe deal with very basic JavaScript. I just wanna make a mostly static site with links in and out of my domain. Maybe a light bit of interactivity for things like search that autocompletes.

CLPM, the Common Lisp Package Manager. The Quicklisp client doesn't do HTTPS, ql-https doesn't do Ultralisp, and OCICL (which I'm currently using) doesn't do system-wide packages. CLPM is a great project, but it's gone neglected long enough that it's bitrotted and needs some thorough patching to be made usable. Fortunately Common Lisp is still as stable as it has been for 31 years, so it's just the code which interacts with 3rd-party libraries that needs updating.

Yeah, I felt that Quicklisp doesn't have the same features as package managers in other languages, and HTTPS is one of them. Also it's run by a single person who doesn't have much time to constantly update the libraries. In comparison I found Clojars^[0] for Clojure better and community-driven like NPM. But obv Clojure has more business adoption than CL. Do you use CL for work?

[0]: https://clojars.org/

It's funny, on one hand I wouldn't want to use CL for work because when money gets involved in something you enjoy you stop enjoying it. On the other hand, however, I would really hate doing any serious work with a language I can't stand, like Python or Clojure.

Was recently reading about Project Ara, the modular smartphone project by Google/Motorola [1]. Would have liked to see a few more iterations of the idea. Something more customizable than what we have today without having to take the phone apart.

The Lisp machine.
I love Lisp, and I love the idea of every part of the system being a Lisp program that can be patched and modified at runtime by the user. Obviously in this day and age some security mechanisms would need to be introduced, but the system design is my hacker's dream.

Google Reader. We could have had a great society, man.

The loss of Google Reader really does feel like the beginning of the end in retrospect.

There are plenty of clones, though. I use CommaFeed and it's pretty good, feels a lot like Google Reader.

Elm programming language. Arguably not dead, but somewhat incomplete and not actively worked on.

Try the Roc language https://www.roc-lang.org/ It's at a very early stage of development but looks promising.

It's been a number of years, but my understanding was they kind of killed all the momentum it had by removing support for custom operators, which broke everyone's code?

A few commits recently. There are lots of competing MLs you can use instead:

- F# (Fable)
- ReasonML
- OCaml (Bucklescript)
- Haskell
- PureScript

IMO the problem with Elm was actually The Elm Architecture.

What's "the Elm architecture"?

A simple UI programming pattern, with a circular, unidirectional data flow. It is very rigid by design, to be side-effect free, functional, unidirectional: https://guide.elm-lang.org/architecture/

I'm no frontend guy, but I think it inspired (or was inspired by?) React (Redux?) maybe. Corrections on this very welcome.

Yes, it was too rigid. Too much boilerplate. The design space of functional UI is still being explored.

Correct - Elm was one of several inspirations for Redux: https://redux.js.org/understanding/history-and-design/prior-...

Opa, along the same lines - a really nice ML-based language for isomorphic full-stack web development.

Yeah, Opa was wildly ahead of its time, I actually just wrote a top-level comment about it. Basically Next.js+TypeScript+modern ECMAScript features, but in 2011.
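For readers who haven't seen it, the model/update/view loop of The Elm Architecture described above can be sketched in a few lines. This is an illustrative rendition in TypeScript, not Elm's actual API; all the names here are made up:

```typescript
// The Elm Architecture, sketched: state lives in a single Model,
// all changes go through a pure update function, and the view is a
// pure function of the Model. The runtime feeds user events back in
// as messages -- that's the circular, unidirectional data flow.

type Model = { count: number };
type Msg = { kind: "increment" } | { kind: "decrement" };

const init: Model = { count: 0 };

// update: (Msg, Model) -> new Model, with no side effects
function update(msg: Msg, model: Model): Model {
  switch (msg.kind) {
    case "increment": return { count: model.count + 1 };
    case "decrement": return { count: model.count - 1 };
  }
}

// view: Model -> a description of the UI (a string here, HTML in Elm)
function view(model: Model): string {
  return `[-] ${model.count} [+]`;
}

// Simulate the runtime loop by replaying a few user events.
const events: Msg[] = [
  { kind: "increment" },
  { kind: "increment" },
  { kind: "decrement" },
];
let model = init;
for (const msg of events) {
  model = update(msg, model);
}
console.log(view(model)); // "[-] 1 [+]"
```

The rigidity people complain about is visible even at this scale: every possible interaction must be enumerated as a message and routed through the one update function, which is also exactly what Redux later borrowed as actions and reducers.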
Adobe Fireworks - easiest vector / photo editor crossover app there ever was. It's a real shame its raster functionality wasn't integrated into Illustrator. Adobe really butchered the whole Macromedia portfolio, didn't they?

(For those unfamiliar, Illustrator is a pure vector graphics editor; once you rasterize its shapes, they become uneditable fixed bitmaps. Fireworks was a vector graphics editor that rendered at a constant DPI, so it basically let you edit raster bitmaps like they were vectors. It was invaluable for pixel-perfect graphic design. Nothing since lets you do that, though with high-DPI screens and resolution-independent UIs being the norm these days, this functionality is less relevant than it used to be.)

At my last job, our designer was a Fireworks holdout. It was very pleasant. As someone who has to implement UIs, I greatly preferred it to Figma, though with today's flat boring designs there's a lot less slicing.

Gah. Fireworks and Dreamweaver were my "web designer" jumpstart. Ps and Ai had nothing on Fireworks.

Flickr - that was the future of photo storage, sharing, discovery. What was the bookmarks social tool called from the 00's? I loved it and it fell off the earth. You could save your bookmarks, "publish" them to the community, share, etc. What ever happened to those build-your-own-homepage apps like startpage (I think)? I always thought those would take off.

>> What was the bookmarks social tool called from the 00's?

del.icio.us! Funnily, also killed by Yahoo, like Flickr.

Wait, I am still using (and paying for) Flickr ... Except it is now owned and run by the father and son who founded SmugMug. It probably has a chance of surviving under their leadership.

I think the market narrowed a lot. I haven't been to Flickr in years, but I get the impression it's for more serious photographers now, like SmugMug. It was the Instagram of its day, with mass market appeal. I think that's what people miss. Not the site, but the community around it.
In this same vein, I always thought Tumblr had a great design for a blog. It hits the perfect balance between a microblog like Twitter and a fat blog like Wordpress. It had various stigmas around the type of people who posted there, which seems to have only gotten worse over the years. It is a shell of its former self and yet another site that fell on hard times after Yahoo ownership. Yahoo really is where Web 2.0 went to die.

Nokia Maps. There was a brief period in the early 2010s where Nokia had the best mapping product on the planet, and it was given away for free on Lumia phones at a time when TomTom and Garmin were still charging $60+ for navigation apps.

Still around as "Here Maps".

Started to suck pretty badly not long after getting acquired by German car companies. It used to be good.

ADSL in the UK. BT had this grand vision for basically providing rich multimedia through the phone line, but in ~1998. Think a mix of on-demand cable and "teleconferencing" with TV-based internet (Ceefax/red button on steroids). It would have been revolutionary and kick-started the UK's jump into online rich media. However, it wouldn't have got past the regulators, as both Sky and NTL (now Virgin) would have protested loudly.

https://maruos.com/
https://en.wikipedia.org/wiki/Ubuntu_Edge

Connect your phone to a display, mouse, keyboard and get a full desktop experience. At the time smartphones were not powerful enough, cables were fiddly (adapters, HDMI, USB-A instead of a single USB-C cable) and virtualization and containers were not quite there. Today, going via pKVM seems like a promising approach. Seamless sharing of data, apps etc. will take some work, though.

The Atom code editor. It was good to have a mainstream alternative to VS Code; it's a pity it reached end-of-life.

Atom was by GitHub, and VS Code by Microsoft. As soon as Microsoft bought GitHub, Atom's fate was sealed.

IIRC Atom was the original Electron project. Eventually VS Code came along and took all the good ideas - the modularity through extensions, and Electron / web-based cross platform - but made it really fast and added IDE-like language support through LSP.
Atom may be dead now, but the idea lives on in VS Code and in the new project by the original developers of Atom: Zed.

> iirc Atom was the original Electron project.

Indeed (and the names are a clue). More specifically, Electron came out of the ideas in Atom, not the other way around. Electron was originally named Atom Shell https://www.electronjs.org/blog/electron/ Atom Shell/Electron was from the very beginning something you could use separately from Atom as a framework for creating desktop apps using Chromium/Node.js.

I always thought Microsoft Popfly had huge potential and was way ahead of its time. It made building web mashups feel like playing with Lego blocks: drag, drop, connect APIs, and instantly see the result. If something like that existed today, powered by modern APIs and AI, it could become the ultimate no-code creativity playground.

HP TouchPad. Just on principle, I'd have liked to see it on the market for more than 49 days! It pains me as an engineer to think of the effort to bring a hardware device to market for such a minuscule run.

webOS was so ahead of its time, and seemed like it would have been a really strong contender to iPad OS.

I built a chatbot startup in 2015. It integrated with WhatsApp (which was possible at the time with some hacks), and had:

- Multimodality: text/audio/images input and output. Integrated OCR.
- Connection with an Asterisk server; it could send and receive voice phone calls! I used it to call for pizzas to a local place via WhatsApp. This was prior to Google's famous demo calling a hairdresser to book a haircut.
- It understood humor and message sentiment, told jokes and sometimes even chimed in with a "haha" if somebody said something funny in a group chat, or sent an appropriate gif reaction.
- Memory (facts database).
- Useful features such as scheduling, polling, translations, image search, etc.
Regarding the tech, I used external models (Watson was pretty good at the time), plus classical NLP processing and symbolic reasoning that I learned in college. Nobody understood the point of it (where's the GUI? how do I know what to ask it? customers asked) and I didn't make a single dime out of the project. I closed it a couple years later. Sometimes I wonder what could've been of it.

VPRI. I was really hoping it would profoundly revolutionise desktop application development and maybe even lead to a new desktop model, and instead they wound up the project without having achieved the kind of impact I was dreaming of.

Anyone remember Openmoko, the first commercialised open source smartphone? It was heaps buggy, not really polished, etc. Its only redeeming feature was the open source software and hardware (specs?).

There was the https://en.wikipedia.org/wiki/PinePhone and its successor, the PinePhone Pro. Bugginess and general impracticalities brought up to more recent standards. Inflation-adjusted, of course!

The PinePhone still is, and is set to be produced for the next 3 years, as promised in 2018.

Windows Longhorn. It looked cool and had some promising features that never made it into Vista, like WinFS.

* bzr: I always found git too complex and not really ergonomic. I really liked bzr's simplicity.
* RethinkDB: I made some small projects with it in the past and it was easy to use.

- Apple AirPort. Took a long time for advanced wifi solutions that "just work" to fill its place, and those are the Nest/Google things that have bs attached like mic+assistant. Unifi is too hard for consumers.
- Gnome 2, dropped from Ubuntu in favor of Unity
- Ford Crown Victoria

> Took a long time for advanced wifi solutions that "just work" to fill its place

What is the modern-day successor? You only mentioned what isn't good.

I loved the AirPort routers. I found it odd that Apple exited the market just as everyone else entered.
I ended up getting a Linksys as a troubleshooting step when I was having internet issues. I don't think the AirPort was the issue, but after migrating, it didn't seem worth going back to a router that was effectively end-of-life.

I still remember, many years ago, having a Sonos system and calling support due to some issues I was having. When they asked what type of router I had and I mentioned it was an AirPort, they immediately moved on to something else being the issue. The reputation was so solid that support wouldn't even bother troubleshooting it.

Memex. It was a solution to the biggest problem facing the scientific community just after WW2, and it still hasn't been implemented, 80 years later!

https://wiki.mozilla.org/Labs/Ubiquity
https://en.wikipedia.org/wiki/IGoogle
https://en.wikipedia.org/wiki/Google_Desktop

and why? = UI/UX

I feel like Zen (Firefox-based) captures a few good things from Ubiquity. It could do more, though. Zen + Kagi gets even more with the bang commands.

JavaScript Style Sheets (JSS)
Introduced by Netscape Navigator 4, it never went mainstream as people were reluctant to give up CSS.

Gitless. I'm a fan of software that allows you to get your feet wet with simple concepts and progressively add complex ones when you feel you're ready. Gitless was my introduction to git.

RAM disks. Basically extremely fast storage using RAM sticks slotted into a specially made board that fits in a PCIe slot. Not sure what happened to the project exactly, but the website disappeared sometime in 2023. The idea that you could read and write data at RAM speeds was really exciting to me. At work it's very common to see microscope image sets anywhere from 20 to 200 GB, and file transfer rates can be a big bottleneck.

Archive capture circa 2023:
https://web.archive.org/web/20230329173623/https://ddramdisk... HN post from 2023:
https://news.ycombinator.com/item?id=35195029

There's now a standard for memory over a physical PCIe interface (https://en.wikipedia.org/wiki/Compute_Express_Link) and off-the-shelf products (https://www.micron.com/products/memory/cxl-memory).

I'm confused why this can't be done in software?

Products to attach RAM to expansion slots have long existed and continue to be developed. It's a matter of adding more memory once all of the DIMMs are full. What to do with it, once it's there, is a concern of software, but specialized hardware is needed to get it there.

Soon you will be able to buy a Gigabyte AI Top CXL R5X4, a PCIe expansion card with up to 512 GB of RAM over four DIMMs.

You can do this in software; I tried it a few times with games and just other stuff ~10 years ago. Why would it have to be a hardware solution?

Not really needed anymore on Linux with https://en.wikipedia.org/wiki/Zram https://wiki.archlinux.org/title/Zram https://wiki.gentoo.org/wiki/Zram for most purposes. (Assuming the host has enough RAM to spare, to begin with.)

Pocket. Never actually "read" anything later. But the dopamine hit of saving something with the click of a button to maybe find it later or tag it. Yes, there are solid alternatives, but Pocket had something sentimental about it.

The IBM Schools Computer. Developed by IBM Hursley in 1967, it was years ahead in its design, with display out to a television and storage on normal audio tape. Would have kick-started an educational revolution if it had been launched beyond the 10 prototype machines. Died due to legal wranglings about patents, IIRC.

I've argued this for years on this site... but AOL. At its best, having IM, email, browser, games, keywords, chats, etc. was a beautiful idea IMO. That they were an ISP seemed secondary or even unrelated to the idea. But they chose to charge for access even in the age of broadband, and adopt gym-level subscription tactics to boot, and people decided they'd rather not pay it, which is to be expected.
I often wonder if they'd have survived as a software company otherwise. They were basically a better thought out Facebook before Facebook, in my opinion. I miss AIM, and that type of messenger in general, a lot. You could purposely choose to be online or offline. Much easier to draw a line back then about how often you were online. Developer Ryan Flaherty's "Via" project, a novel approach to streaming large games in real time. https://www.youtube.com/watch?v=e5wAn-4e5hQ https://www.youtube.com/watch?v=QWsNFVvblLw Summary: >This presentation introduces Via, a virtual file system designed to address the challenges of large game downloads and storage. Unlike cloud gaming, which suffers from poor image quality, input latency, and high hosting costs, Via allows games to run locally while only downloading game data on demand. The setup process is demonstrated with Halo Infinite, showing a simple installation that involves signing into Steam and allocating storage space for Via's cache. >Via creates a virtual Steam library, presenting all owned games as installed, even though their data is not fully downloaded. When a game is launched, Via's virtual file system intercepts requests and downloads only the necessary game content as it's needed. This on-demand downloading is integrated with the game's existing streaming capabilities, leveraging features like level-of-detail and asset streaming. Performance metrics are displayed, showing download rates, server ping, and disk commit rates, illustrating how Via fetches data in real-time. >The system prioritizes caching frequently accessed data. After an initial download, subsequent play sessions benefit from the on-disk cache, significantly reducing or eliminating the need for network downloads. This means the actual size of a game becomes less relevant, as only a portion of it needs to be stored locally. While server locations are currently limited, the goal is to establish a global network to ensure low ping. 
The presentation concludes by highlighting Via's frictionless user experience, aiming for a setup so seamless that users are unaware of its presence. Via is currently in early access and free to use, with hopes of future distribution partnerships. I'm amazed the video still has under 4,000 views. Sadly, Flaherty got hired by xAI and gave up promoting the project. https://x.com/rflaherty71/status/1818668595779412141 But I could see the technology behind it working wonders for Steam, Game Pass, etc. Wait until you hear that almost all Unity games don't really have asset streaming because the engine loads things eagerly by default. I don't see how this could take off. Internet speeds are getting quicker, disk space is getting cheaper, and this will slow down load times. And what's worse is that the more you need this tech, the worse an experience you have. Everpix: Looked like good execution, but they were probably ahead of their time. Also this: https://news.ycombinator.com/item?id=6676494 Redmart (Singapore): Best web-based online store to this date (obviously personal view). No one even tries now that mobile apps have won. https://techcrunch.com/2016/11/01/alibaba-lazada-redmart-con... The Prismatic news reader. It solved recommendations before the rest, but died because news died, and presumably made little money. Their attributed recommendations model is worth emulating. I don't remember if they supported both positive and negative feedback, but Google News recommendations today do support attributed negative feedback. WebOS, the Palm smartphone OS. It was beautiful at the time and predicted many of the swipe gestures iOS and Android adopted much later. systemd-fleet, by the original CoreOS folks. https://github.com/coreos/fleet I used this when it was brand new for a bit and it was so incredibly smooth and worked so well. It solved the problem of controlling systemd units remotely so well.
I'm pretty sure the only reason it never actually took off was Kubernetes and CoreOS's acquisition; however, it actively solves the 'other half' of the k8s problem, which is managing the state of the host itself. Visix Vibe. It was a "WYSIWYG"-type visual programming environment for .. Java. It had its own cross-platform UI and other frameworks too, so you could "write once in Java, and ship on all the things" .. well, theoretically. It got abandoned too soon. But it was quite fun to build apps with it for a while, almost Delphi-like. I always wonder, if it had gone open source, whether things would have been different vis-à-vis Flash, etc. wua.la … the original version. You share part of your storage to get the same amount back as resilient cloud storage from others. Was bought and killed by LaCie (now Seagate). They later provided paid-for cloud storage under the same name but it didn’t take off. In the late 90s there was a website called fuckedcompany which was a place where people could spill the beans about startups (mainly in Silicon Valley). It was anonymous and a pretty good view into the real state of tech. Now there is Twitter/X but it's not as focused on this niche. The creator now makes wild, bespoke headphones https://www.reddit.com/user/pudjam667/submitted/ The closest sites I've found are Web3 is Going Just Great and Pivot to AI, which are newsfeeds of various car crashes in their respective hype arenas, although without any insider scoops/gossip. fuckedcompany was awesome but very much a product of the early stages of the .com bubble poppage. I kind of expect we might see something similar if the AI bubble pops. I wonder who owns the domain now. Sourcetrail: https://en.wikipedia.org/wiki/Sourcetrail People talk so much about how you need to write code that fits well within the rest of the codebase, but what tools do we have to explore codebases and see what is connected to what?
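For what it's worth, the extraction side of what a tool like Sourcetrail does is surprisingly approachable. Here's a toy sketch using only Python's stdlib `ast` module to pull a rough call graph out of a module (nothing near Sourcetrail's actual indexer, and the sample source is made up for illustration):

```python
# Toy Sourcetrail-style dependency extraction: parse a module with `ast`
# and record which top-level function calls which names.
import ast
from collections import defaultdict

SOURCE = """
def load(path):
    return open(path).read()

def parse(path):
    data = load(path)
    return data.splitlines()

def main():
    for line in parse("x.txt"):
        print(line)
"""

def call_graph(source):
    """Map each top-level function name to the plain names it calls."""
    tree = ast.parse(source)
    graph = defaultdict(set)
    for node in tree.body:
        if isinstance(node, ast.FunctionDef):
            for call in ast.walk(node):
                # Only direct name calls; method calls (x.y()) are skipped here.
                if isinstance(call, ast.Call) and isinstance(call.func, ast.Name):
                    graph[node.name].add(call.func.id)
    return dict(graph)

print(call_graph(SOURCE))
# e.g. {'load': {'open'}, 'parse': {'load'}, 'main': {'parse', 'print'}}
```

A real tool obviously needs cross-file resolution, classes, and a clickable UI on top, but the raw "what is connected to what" data is not the hard part.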
Clicking through files feels kind of stupid because if you have to work with changes that involve 40 files, good luck keeping any of that in your working memory. In my experience, the JetBrains dependency graphs also aren't good enough. Sourcetrail was a code visualization tool that allowed you to visualize those dependencies and click around the codebase that way, see what methods are connected to what and so on, thanks to a lovely UI. I don't think it was enough alone, but I absolutely think we need something like this: https://www.dbvis.com/features/database-management/#explore-... but for your code, especially for codebases with hundreds of thousands or like above a million SLoC. Example: https://github.com/CoatiSoftware/Sourcetrail/blob/master/doc... Another example: https://github.com/CoatiSoftware/Sourcetrail/blob/master/doc... I yearn to someday view entire codebases as graphs with similarly approachable visualization, where all the dependencies are highlighted when I click an element. This could also go so, so much further - you could have a debugger breakpoint set and see the variables at each place, alongside being able to visually see how code is called throughout the codebase, or hell, maybe even visualize every possible route that could be taken. There was a virtual platform through which to learn Chinese called ‘Zon’. Someone obviously put years of work into it but no one ever joined and it turned into this great-looking ghost town. The Opa language, circa 2012; it was a typed Next.js before its time. I think the market was still skeptical about Node.js on the server at the time, but other than that I don’t really know why it didn’t take off. I came to say Opa too. I liked the language, but the Meteor-like framework it was bundled with, while nice for prototyping, was a pain to work around when it didn't do what you needed. That said, frameworks were all the buzz back in the day, so the language alone probably wouldn't have gone anywhere without it.
Launching under AGPL was the kiss of death. They eventually went MIT, but the developers it steered away probably never gave it a second chance. I am the only one that liked stereoscopic 3D. Up the frame rate (like in games and some movies) and it looks great! TrueCrypt. Free multi-platform open source disk encryption that suddenly disappeared in mysterious circumstances. choojs. All of the upside and none of the downside of React. No JSX and no compiler, all native JS. The main dev is paid by Microsoft to do OSS Rust nowadays. I use choo for my personal projects and have used it twice professionally. https://github.com/choojs/choo#example The example is like 25 lines and introduces all the concepts. Fewer moving parts than Svelte. You can get the same thing with lit-html and any of the add-on libraries that flesh it out. For example, Haunted is a React hooks implementation for lit: https://github.com/matthewp/haunted Choo suffered from not having an ecosystem, same with Mithril and other "like React but not" also-rans. 10/GUI did some deep thinking about the limitations and potential of the (then fairly new) multi-touch input method.
I wish something more had come out of it; instead it stayed a niche concept-art video that is mostly forgotten now. I’m not arguing the solutions it outlined are good, but I think some more discussion around how we interact with touch screens is needed. Instead, we are still typing on a layout that was invented for mechanical typewriters - in 2025, on our touch screens. SMIL. Nothing comparable for seamless media stream composition, 20 years later. Pivotal Tracker. Users loved it; it had an excellent model for tracking work and limiting work in progress on software projects. There is no really good alternative, and the usual suspects for tracking project work are horrible in comparison. I don’t know about that. My employer was all in on Pivotal and we used it for several years. Then one day a dev stumbled across Linear, we all tried it, and switched the whole company within a month or so. Lanyrd.com & SlideShare. Nothing ever came close to making it so easy to find conferences to attend, and to find the slides and conversation around them. OS/2, my beloved. OS/2 had the best API that I’ve worked with. We did major banking apps in the early 90s. OS/2 was vastly superior to Windows NT and Windows. I was super excited for BeOS myself. I was a little surprised to have to scroll this far down to see BeOS come up. The first Amiga mention wasn't far above it either. I'm booting and running Haiku on my ThinkPad. It's a from-scratch workalike of BeOS, and able to run Be software. Though, frankly, Be software is totally 1990s, so a lot of Linux software written for Qt has been ported to Haiku. In the end I wound up with basically the same application software as on my Debian desktop, except running on Haiku instead of Linux. Haiku is noticeably snappier and more responsive than Linux+X+Qt+KDE, though. Did an install of OS/2 3.0 recently, and it was just as wonderful as the first time I used it. That team got so much so right.
In late September or early October 1996, Fry's Electronics placed a full-page promo ad on the back of the business section of the San Jose Mercury News for OS/2 4.0 "WRAP [sic]" in 256 pt font in multiple places. Oops! The TUNES [1] operating system and programming language project.
The reasons for its failure are described perfectly on the archival website: > TUNES started in 1992-95 as an operating system project, but was never clearly defined, and it succumbed to design-by-committee syndrome and gradually failed. Compared to typical OS projects it had very ambitious goals, which you may find interesting. I had seen it before and I did find it interesting, as well as some other ideas from other systems (including Amiga, TRON, capability-based systems, etc), and I had some of my own ideas. (I had also thought of some similar ideas independently, but not all of them.) I do not completely agree with all of the ideas. Also, I think that a new computer hardware design can be made to support the new operating system (although emulation is also possible). (My own operating system idea currently does not have a name, and has some similar ideas from TUNES (and TRON, etc) and many differences.) GNOME Conduit software. Used to synchronize a lot of my local-first data (calendar, photos, music) to different online services.
Nice to see in one place where everything goes and what the sync status is. CueCat. It was an affordable barcode scanner that anyone could have connected to their computer, and it scanned barcodes. It took almost two decades before we could finally do it again with our mobile phones. MySpace, SoundCloud, etc. A place where artists and consumers could freely communicate and socialize without hassle. Died because of:
Stupidity, commercialisation and walled-gardening. Google Stadia. I want to try games, but don't want to own a TV, desktop, or Windows anything. Google Wave. It had a bunch of agents participating in editing the text together with you, making spelling fixes, finding additional information to enrich your content, and so much more. Docker Swarm, so much easier to use than k8s. Still my preferred solution for hosting web apps. Firefox Panorama: showed all your tabs as thumbnails and let you organize them into groups visually. Netflix Falcor. The GraphQL hype killed a much better alternative for many use cases. There were only a few missing pieces and improvements, such as a proxy-based adapter layer for popular frontend frameworks. I'm now the lonely last user hoping to find a way to reboot development. Google Wave. It was horrible from a performance point of view, but was really interesting to use. Some of
its features have made their way into the Google Docs ecosystem and Office 365, but not all. XenClient. I would really love to have some minimal hypervisor OS running, and then you slap multiple OSes on top of that w/ easy full GUI switching via some hotkeys like Ctrl+Shift+F1. Additionally, special drivers to virtualize graphics and sound devices so every VM has full desktop capabilities and low latency. Unfortunately, it died because it's very niche and also they couldn't keep up with development of drivers for desktops. This is even worse today... Microsoft Courier. Dual-screen iPad killer, productivity-optimised. IIRC Microsoft OneNote is its only legacy. Killed because both the Windows team and the Office team thought it was stepping on their toes. I could think of many examples, but I'll talk about the top four that I have in mind that I'd like to see re-evaluated for today's times. 1. When Windows Vista was being developed, there were plans to replace the file system with a database, allowing users to organize and search for files using database queries. This was known as WinFS (https://en.wikipedia.org/wiki/WinFS). I was looking forward to this in the mid-2000s. Unfortunately Vista was famously delayed, and in an attempt to get Vista released, Microsoft pared back features, and one of these features was WinFS. Instead of WinFS, we ended up getting improved file search capabilities. It's unfortunate that there have been no proposals for database file systems for desktop operating systems since. 2. OpenDoc (https://en.wikipedia.org/wiki/OpenDoc) was an Apple technology from the mid-1990s that promoted component-based software. Instead of large, monolithic applications such as Microsoft Excel and Adobe Photoshop, functionality would be offered in the form of components, and users and developers could combine these components to form larger solutions.
For example, as an alternative to Adobe Photoshop, there would be a component for the drawing canvas, and there would be separate components for each editing feature. Components could be bought and sold on an open marketplace. It reminds me of Unix pipes, but for GUIs. There's a nice promotional video at https://www.youtube.com/watch?v=oFJdjk2rq4E. OpenDoc was a radically different paradigm for software development and distribution, and I think this could have been an interesting contender against the dominance that Microsoft and Adobe enjoy in their markets. OpenDoc actually did ship, and there were some products made using OpenDoc, most notably Apple's Cyberdog browser (https://en.wikipedia.org/wiki/Cyberdog). Unfortunately, Apple was in dire straits in the mid-1990s. Windows 95 was a formidable challenger to Mac OS, and cheaper x86 PCs were viable alternatives to Macintosh hardware. Apple was an acquisition target; IBM and Apple almost merged, and there was also an attempt to merge Apple with Sun. Additionally, the Macintosh platform depended on the availability of software products like Microsoft Office and Adobe Photoshop, the very types of products that OpenDoc directly challenged. When Apple purchased NeXT in December 1996, Steve Jobs returned to Apple, and all work on OpenDoc ended not too long afterward, leading to this now-famous exchange during WWDC 1997 between Steve Jobs and an upset developer (https://www.youtube.com/watch?v=oeqPrUmVz-o). I don't believe that OpenDoc fits in with Apple's business strategy, even today, and while Microsoft offers component-based technologies that are similar to OpenDoc (OLE, COM, DCOM, ActiveX, .NET), the Windows ecosystem is still dominated by monolithic applications. I think it would have been cool had the FOSS community pursued component-based software.
It would have been really cool to apt-get components from remote repositories and link them together, either using GUI tools, command-line tools, or programmatically to build custom solutions. Instead, we ended up with large, monolithic applications like LibreOffice, Firefox, GIMP, Inkscape, Scribus, etc. 3. I am particularly intrigued by Symbolics Genera (https://en.wikipedia.org/wiki/Genera_(operating_system)), an operating system designed for Symbolics Lisp machines (https://en.wikipedia.org/wiki/Symbolics). In Genera, everything is a Lisp object. The interface is an interesting hybrid of early GUIs and the command line. To me, Genera could have been a very interesting substrate for building component-based software; in fact, it would have been far easier building OpenDoc on top of Common Lisp than on top of C or C++. Sadly, Symbolics' fortunes soured after the AI winter of the late 1980s/early 1990s, and while Genera was ported to other platforms such as the DEC Alpha and later the x86-64 via the creation of a Lisp machine emulator, it's extremely difficult for people to obtain a legal copy, and it was never made open source. The closest things to Genera we have are Xerox Interlisp, a competing operating system that was recently made open source, and open-source descendants of Smalltalk-80: Squeak, Pharo, and Cuis-Smalltalk. 4. Apple's "interregnum" years between 1985 and 1996 were filled with many intriguing projects that were either never commercialized, were cancelled before release, or did not make a splash in the marketplace. One of the most interesting projects during the era was Bauhaus, a Lisp operating system developed for the Newton platform. Mikel Evins, a regular poster here, describes it here (https://mikelevins.github.io/posts/2021-07-12-reimagining-ba...). It would have been really cool to have a mass-market Lisp operating system, especially if it had the same support for ubiquitous dynamic objects as Symbolics Genera.
> It's unfortunate that there have been no proposals for database file systems for desktop operating systems since. You can have one today if you want, although nobody knows about it. Step 1. Install a local Oracle DB https://hub.docker.com/r/gvenzl/oracle-free#quick-start Step 2. Set up DBFS https://docs.oracle.com/en/database/oracle/oracle-database/2... Step 3. Mount it via FUSE or NFS. Step 4. Also access the underlying tables via SQL. > OpenDoc For anyone interested in the Apple future that could have been, check out Jim Miller's articles, e.g. on LiveDoc (https://www.miramontes.com/writing/livedoc/index.php) OpenDoc was mostly given to Taligent (the Apple and IBM joint venture) to develop. It was full-on OO: about 35 files for a minimal application, which meant that Erich Gamma had to build a whole new type of IDE, which was unusable. He likely learned his lesson: it's pretty hard to define interfaces between unknown components without forcing each one to know about all the others. MIME types for mail addressed much of the demand for pluggable data types. Re: obtaining a legal copy of Genera, as of 2023 Symbolics still existed as a corporate entity and they continued to sell x86-64 laptops with "Portable Genera 2.0". I bought one from them then, and occasionally see them listing some on eBay. (This isn't intended as an advertisement or endorsement, just a statement. I think it's quite unfortunate that Symbolics's software hasn't been made freely available, since it's now really only of historical interest.) I'm intrigued by Symbolics Genera too. It would have been interesting seeing further development of Lisp OS, especially once it would have had an internet connection. Rewriting part of your OS and seeing the changes in real time? Maybe web apps could have been just software written in Lisp, downloaded to the machine and directly executed in a safe environment on top of the Genera image. Big stuff.
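The database-file-system idea from the WinFS and DBFS comments above is also easy to toy with in user space. A minimal sketch with Python's stdlib sqlite3 (a hypothetical schema for illustration only, nothing like the real WinFS or Oracle DBFS):

```python
# Toy "files as database rows" sketch: keep file contents and metadata
# in SQLite, then locate files with queries instead of directory walks.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("""
    CREATE TABLE files (
        path    TEXT PRIMARY KEY,
        mime    TEXT,
        size    INTEGER,
        content BLOB
    )
""")

# A few made-up files standing in for a real filesystem.
docs = [
    ("/notes/todo.txt", "text/plain", 12, b"buy milk\n..."),
    ("/pics/cat.jpg", "image/jpeg", 204800, b"\xff\xd8..."),
    ("/notes/plan9.txt", "text/plain", 9, b"9p rules!"),
]
db.executemany("INSERT INTO files VALUES (?, ?, ?, ?)", docs)

# "Every text file under 1 KB" is a query, not a directory traversal.
rows = db.execute(
    "SELECT path FROM files WHERE mime = 'text/plain' AND size < 1024 ORDER BY path"
).fetchall()
print([r[0] for r in rows])  # ['/notes/plan9.txt', '/notes/todo.txt']
```

WinFS promised roughly this at the OS level, with richer schemas (contacts, mail, photos) shared across applications; the hard part was never the query engine but getting every program to agree on the schema.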
Windows Phone. Windows Phone's UI is still with us, from Windows 8 onwards. Everything on 8, 10, and 11 is optimized for a touch interface on a small screen, which is ridiculous on a modern desktop with a 32" or so monitor and a trackball or mouse. False. The Metro design was abandoned long ago. No live tiles, no typography-first minimal UIs in Windows 10/11. I pin an email app to the taskbar/Start; I don't see the unread count. From Windows 10, there is a switch between desktop and touch mode. They stopped supporting small tablets some years ago though, and made it worse with every Windows update. I can only surmise that it was to make people stop using them. Slow GUI, low contrast, killed apps. I thought Google Wave was going to kill email and chat and a whole bunch of other stuff. Artifact, by the Instagram founders. I discovered it (and thought it was great) by reading their termination announcement. SGI IRIX, and SGI hardware in general, should be revived and return to the scene. I'd love to have an SGI laptop. Or an SGI cell phone or VR headset. Mozilla Heka. As far as data collection and processing goes, we are still stuck with Logstash after all of these years. Heka promised a much more efficient solution, being implemented with Go and Lua plugins. https://www.kite.com for Python
I first learned about it when I was working in a university group and had the task of porting a windowing algorithm already working in MATLAB to Python.
It felt like a modern linter and LSP with additional support through machine learning. I don't quite know why it got comparatively little recognition, but perhaps enough to remain an avant-garde pioneer of both Python and machine-learning support for further generations and wider applications.
Perhaps because their repeated bad behavior as a company outweighed anything good they put out. Gentoo file manager. (Not the Linux distribution with the same name.) I have used it for years. A two-pane manager, it makes defining file associations, applications invoked by extensions, and shortcut buttons easy and convenient. Sadly it is abandonware now. Slowly migrating to Double Commander... All those modular smartphones, and also Amazon's Fire Phone. Why? Obviously close-to-zero market. It was unbelievable how those people thought those projects would even succeed. Fortress language. It suffered from being too Haskell-like in terms of too many non-orthogonal features. Rust and Go applied lessons from it, perhaps indirectly. Fortress had great ideas, but I'd say the closest thing to it in the real world now might be Julia. Their operator precedence system was one of my favourite pieces of language design. The tl;dr was that you could group operators into precedence sets, and an expression involving operators that all came from the same set would have that set's precedence rules applied, but if you had an expression involving mixed sets you needed to add the parentheses. Crucially, they also supported operator overloading, and the same operator could be used in a different set as long as everything could be parsed unambiguously. (Caveat: I never used the language, I just read about the operator design in the docs, and it was very eye-opening in the sense that every other language's operator precedence system suddenly felt crude and haphazard.) X.400. We're approaching it by stepwise refinement. It had X.500, which lives on as X.509 certificates and LDAP. ISO/OSI had a session layer.
i.e., much of what QUIC does regarding multiple underlying transports. Speaking of X.509, the S-expression certificate format was more interesting in many ways. OSI's session layer did very little more than TCP/UDP port numbers; in the OSI model you would open a connection to a machine, then use that connection to open a session to a particular application. X.400 was a nice idea, but the ideal of having a single global directory predates security. I can understand why it never happened. On X.509, the spec spends two chapters on attribute certificates, which I've never seen used in the wild. It's a shame; identity certificates do a terrible job at authentication. CORBA: it got hopelessly complex, but its full potential was never reached as the greedheads took it over. Metasploit Incident Response Vehicle (MIRV). I was super pumped when it was announced; it later died in obscurity. Adobe Flex with Adobe Catalyst. Design a GUI in Photoshop, export it to Flex/Flash to add interactivity. Looked cool during demos. Got killed when Flash died. For me, DESQview. Microsoft tried to buy it in order to use its tech in their Windows system. I wonder how things would be today if they were able to purchase it. But DESQview said "no". Instead it went into a slow death spiral due to Windows 95. Love seeing this one. My uncle was co-founder of Quarterdeck, and I grew up in a world of DESQview and QEMM. It was a big influence on me as a child. Got a good family story about that whole acquisition attempt, but I don't want to speak publicly on behalf of my uncle. I know we've talked at length about the what-ifs of that moment. I do have a scattering of some neat Quarterdeck memorabilia I can share, though: https://www.dropbox.com/scl/fo/0ca1omn2kwda9op5go34e/ACpO6bz... DESQview/X sucked the wind out of DESQview's sails. It was, on paper, a massive upgrade. I had been running DESQview for years, with a dial-up BBS in the background. But you couldn't actually buy /X.
After trying to buy a copy, my publisher even contacted DESQ's marketing people to get a copy for me, and they wouldn't turn one over. Supposedly there were some copies actually sold, but too few, too late, and then /X was dropped. There was at least one more release of plain DESQview after that, but by then Windows was eating its lunch. Killed more like it: but I miss the old Sun/Solaris/SPARC days. Make hardware expensive again! Zenbe, a cute and practical webmail interface. Bought and killed by Facebook way too soon! Humane AI Pin. I think they launched 2 years too early and were too greedy with device pricing and subscription. Also, if they had focused on being an accessory for Android/iPhone, they could have reduced power usage and cost as well. Their execution was of course bad, but I think today's LLM models are better and faster and there are many more OSS models to reduce costs. The hardware, though, looked nice, and the pico projector was an interesting concept, even though not the best executed. Wine predates ReactOS. It was basically a FOSS duplicate of Sun's WABI. I wrote a bunch of software in Borland Delphi, which ran in Windows, Wine, and ReactOS with no problems. Well, except for ReactOS' lack of printing support. As long as you stay within the ECMA or published Windows APIs, everything runs fine in Wine and ReactOS. But Microsoft products are full of undocumented functions, as well as checks to see if they're running on real Windows. That goes back to the Windows 3.1 days, when 3.1 developers regularly used OS/2 instead of DOS, and Microsoft started adding patches to fail under OS/2 and DR-DOS. So all that has to be accounted for by Wine and ReactOS. A lot of third-party software uses undocumented functions as well, especially stuff written back during the days when computer magazines were a thing, and regularly published that kind of information.
A lot of programmers found the lure of undocumented calls to be irresistible, and they wound up in all kinds of commercial applications where they really shouldn't have been. In my experience anything that will load under Wine will run with no problems. ReactOS has some stability problems, but then the developers specifically call it "alpha" software. Despite that, I've put customers on ReactOS systems after verifying all their software ran on it. It gets them off the Microsoft upgrade treadmill. Sometimes there are compatibility problems and I fall back to Wine on Linux. Occasionally nothing will do but real Windows. Hard disagree. The Humane AI Pin ad was a classic Silicon Valley ad that screamed B2VC and demonstrated nothing actually useful that couldn't be done with an all-in-one phone app (or even the ChatGPT app) and bluetooth earbuds that you already have. Which reduces its innovation level to nothing more than a chest-mounted camera. You want real B2C products that people would actually buy? Look at the Superbowl ads instead. Then watch the Humane ad again. It's laughable. The Starling Home Hub. Best way to bring Google/Nest hardware into HomeKit. Killed by Trump's tariffs. LSR, the "Linux Screen Reader", an ambitiously designed Python implementation of a GUI screen reader developed by IBM starting around 2006 or so. The project was ended in 2008 when IBM ended all its accessibility involvement in FLOSS. IPv6 - Everything was supposed to be flat, devices with just one unique IP addr. Nope. It accounts for half of all Google’s customer traffic. It’s not dead, not by a long shot. If you have a mobile phone, you’re almost certainly using IPv6 today. The flat Internet is dead. Ah, I see the part you’re emphasizing. Yeah. That still stings. It didn’t have to be this way. Ceylon, a JVM language, developed by Red Hat, now abandoned at Eclipse. Lost the race with Kotlin but proposed more than just syntax sugar over Java.
Anonymous union types, comprehensions, a proper module system... I really liked Ceylon. It was competing against Groovy, Kotlin, and Scala, which all seemed to come out around the same time. All my ideas :') Also, I did not experience them personally, but I love watching computing history videos on YouTube, and a lot of the computers and operating systems from the 1980s and early 1990s got buried too soon, mostly because of their owners being short-sighted idiots in not realizing the full potential of what computers and video games could become, and having wildly successful hits on their hands with legions of faithful fans but not knowing how to build on that success or what the fans actually wanted to see in updated hardware. Plone CMS. When it appeared in 2001, there was nothing comparable. I'm not sure there still is. It was very flexible and allowed building complex websites from components. Many ideas were pretty novel, at least I've never seen them in any web framework/CMS before. It still exists but is nowhere near as popular as it was in the 2000s-2010s. Knowing when to say "no" to a project is an important skill. One always must define a one-sentence goal or purpose before teams think about how to build something. Cell processors, because most coders can't do parallelism well. Altera consumer FPGA, as they chose behavioral rather than declarative best practices... then the Intel merger... metastability in complex systems is hard, and most engineers can't do parallelism well... World Wide Web, because social media and marketers. Dozens of personal projects, because sometimes things stop being fun. =3 I remember being quite disappointed that PowerPC did not, contrary to expectations, dethrone the Intel hegemony. Google Glass. Thanks society. People always fail to see something that is an inevitability. Humans lack foresight because they don't like change. At least with a smartphone it’s pretty clear when someone is filming you.
Google Glass was too much of an enabler for creeps.

Nah, Glass was impressive for such a big org like Google, but smartphones are popular because people use them like portable televisions. Glanceable info and walking directions are more like an Apple Watch-sized market, without the fashion element. Meta is about to find out.

Google Wear is pretty much Google Glass on your wrist, so you don't burn out your eyes looking up and to the side.

Wild that people would downvote your low-stakes personal opinion given as a direct ask from OP. I am 100% with you. Google Glass was so much before its time, it might be reinvented a few more times and abandoned again before finally becoming a success.

Yea, crazy, I upvoted just now. Google Glass sucks though, and glasses will never be a thing. Google and Meta and … can spend $8T and come up with the most insane tech etc., but no one will be wearing f'ing glasses :)

I don't think people are downvoting for the mention of Google Glass, but due to the rest of the comment making a value judgement many are sure to disagree with.

Namecoin and decentralized DNS. Web 3.0 in general is pretty much dead; federated services are not a replacement, and we really needed decentralized services right about now.

Theranos

Have to agree. This whole procedure of booking an appointment with a GP who then books you an appointment with a lab who then takes your blood is a huge waste of time. The technology is largely there for people to continuously monitor their health in real time; you see this in smartwatches as feature after feature slowly trickles in.

Betamax. Because I bought a player and it gave better quality video.

Technology Connections would like a word.

I bought my Betamax in the store, having compared VHS to Beta side by side. When I say that Beta was better, the main thing I'm referring to is the clarity during fast-forwarding or reversing. When scanning through a video, VHS was full of staticky noise, but Beta was clean.
I don't recall that the quality during regular playback was particularly different, but I like to be able to see what's happening when I move around a video. It seems like a clear winner. Of course, this was comparing one particular Beta machine to one particular VHS machine.

[flagged]

So you think the 'prime time' for American democracy was not in the past, nor the present, but will eventually happen in the future? Or are you claiming that it died a long, long time ago, and now is when we really need it, but it wasn't needed before?

Is American democracy a 'project'?

I am generally not one to engage in online political discussion, especially when it stems from such a glib and self-serving post as GP, but it is my thread and I am running out of time in the year to hit my quota (1). Project seems a fitting description of American democracy, and the project aspect is part of what makes it American. We the people are working towards that more perfect Union even if at times it does not seem it; the system mostly works, but there is no straight line between where we started and that more perfect Union, and whatever that more perfect Union is, is not a constant. We do get lost along the way, take roundabout paths, sidestep, go backwards, etc. It is a requirement of the shared aim, and sometimes a step backwards is actually a step forward. We can't achieve that more perfect Union; all we can do is keep trying and hope we are more or less going in the correct general direction.

As long as American democracy keeps evolving it is alive, and it has held up better than one would expect. For at least a century the country has been run by parties that supposedly want to kill American democracy, strangle it with their ideology, and defeat the enemies of American democracy, which just happen to be their political rivals; we would live in a utopia if it was not for <political party>. But they keep failing and the project continues on.
Founder perspective: “avoid patents by staying 20 years behind” is the tragedy.
I published a 2-page CC0 initiative that splits protection into two layers:
• GLOBAL layer — fast, low-friction recognition for non-strategic inventions
• LOCAL-STRATEGIC layer — conventional national control for sensitive tech
Goal: cut admin drag/time-to-market while keeping sovereignty intact. Brief (CC0): https://doi.org/10.5281/zenodo.17305774
Curious: would this structure have saved any of the projects mentioned here?

Apple's scanning system for CSAM. The vast majority of the debate was dominated by how people imagined it worked, which was very different to how it actually worked. It was an extremely interesting effort where you could tell a huge amount of thought and effort went into making it as privacy-preserving as possible. I'm not convinced it's a great idea, but it was a substantial improvement over what is in widespread use today, and I wanted there to be a reasonable debate on it instead of knee-jerk outrage. But congrats, I guess. All the cloud hosting systems scan what they want anyway, and the one that was actually designed with privacy in mind got screamed out of existence by people who didn't care to learn the first thing about it.

Good riddance to a system that would have provided precedent for client-side scanning for arbitrary other things, as well as likely false positives.

> I wanted there to be a reasonable debate on it

I'm reminded of a recent hit-piece about Chat Control, in which one of the proponent politicians was quoted as complaining about not having a debate. They didn't actually want a debate, they wanted to not get backlash. They would never have changed their minds, so there's no grounds for a debate. We need to just keep making it clear the answer is "no", and hopefully strengthen that to "no, and perhaps the massive smoking crater that used to be your political career will serve as a warning to the next person who tries".

This.
No matter how cool the engineering might have been, from the perspective of what surveillance policies it would have (and very possibly did) inspire/set precedent for… Apple was very much creating the Torment Nexus from "Don't Create the Torment Nexus."

> from the perspective of what surveillance policies it would have (and very possibly did) inspire/set precedent for…

I can't think of a single thing that's come along since that is even remotely similar. What are you thinking of? I think it's actually a horrible system to implement if you want to spy on people. That's the point of it! If you wanted to spy on people, there are already loads of systems that exist which don't intentionally make it difficult to do so. Why would you not use one of those models instead? Why would you take inspiration from this one in particular?

The problem isn't the system as implemented; the problem is the very assertion "it is possible to preserve the privacy your constituents want, while running code at scale that can detect Bad Things in every message." Once that idea appears, it allows every lobbyist and insider to say "mandate this, we'll do something like what Apple did but for other types of Bad People", and all of a sudden you have regulations that force messaging systems to make this possible in the name of Freedom. Remember: if a model can detect CSAM at scale, it can also detect anyone who possesses any politically sensitive image. There are many in politics for whom that level of control is the actual goal.

> The problem isn't the system as implemented

Great!

> the problem is the very assertion "it is possible to preserve the privacy your constituents want, while running code at scale that can detect Bad Things in every message."

Apple never made that assertion, and the system they designed is incapable of doing that.

> if a model can detect CSAM at scale, it can also detect anyone who possesses any politically sensitive image.

Apple's system cannot do that.
If you change parts of it, sure. But the system they proposed cannot. To reiterate what I said earlier:

> The vast majority of the debate was dominated by how people imagined it worked, which was very different to how it actually worked.

So far, you are saying that you don't have a problem with the system Apple designed, and you do have a problem with some other design that Apple didn't propose, that is significantly different in multiple ways. Also, what do you mean by "model"? When I used the word "model" it was in the context of using another system as a model. You seem to be using it in the AI sense.

You know that's not how it worked, right?

> I can't think of a single thing that's come along since that is even remotely similar. What are you thinking of?

Chat Control, and other proposals that advocate backdooring individual client systems. Clients should serve the user.

> Chat Control, and other proposals that advocate backdooring individual client systems.

Chat Control is older than Apple's CSAM scanning and is very different from it.

> Clients should serve the user.

Apple's system only scanned things that were uploaded to iCloud. You missed the most important part of my comment:

> I think it's actually a horrible system to implement if you want to spy on people. That's the point of it! If you wanted to spy on people, there are already loads of systems that exist which don't intentionally make it difficult to do so. Why would you not use one of those models instead? Why would you take inspiration from this one in particular?

I don't think you can accurately describe it as client-side scanning, and false positives were not likely.

Depending upon how you view it, false positives were either extremely unlikely, or 100% guaranteed for practically everybody. And if you think the latter part is a problem, please read up on it!
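The "100% guaranteed for practically everybody" remark refers to the published design's use of threshold secret sharing: each reported match carried only a share of a per-account key, so the server could decrypt match metadata only once an account crossed a threshold of matches, and synthetic matches were injected so every account appeared to have a few. A toy sketch of just the threshold property follows (plain Shamir sharing over a prime field; this is illustrative only, not Apple's actual construction, and all names here are made up):

```python
import random

P = 2**61 - 1  # Mersenne prime used as the field modulus

def eval_poly(coeffs, x):
    """Evaluate a polynomial (coeffs[0] is the constant term) at x, mod P."""
    acc = 0
    for c in reversed(coeffs):
        acc = (acc * x + c) % P
    return acc

def make_shares(secret, threshold, n, rnd):
    """Split `secret` into n shares; any `threshold` of them recover it."""
    coeffs = [secret] + [rnd.randrange(P) for _ in range(threshold - 1)]
    return [(x, eval_poly(coeffs, x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0; too few shares give an unrelated value."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret

rnd = random.Random(0)
key = 123456789  # stands in for a per-account decryption key
shares = make_shares(key, threshold=3, n=5, rnd=rnd)

assert reconstruct(shares[:3]) == key   # at the threshold: key recoverable
assert reconstruct(shares[2:]) == key   # any 3 shares work
assert reconstruct(shares[:2]) != key   # below the threshold: an unrelated value
```

Below the threshold, interpolation yields a field element statistically unrelated to the key, which is the property the design leaned on: a handful of isolated (or synthetic) matches reveals nothing about an account.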
> I'm reminded of a recent hit-piece about Chat Control, in which one of the proponent politicians was quoted as complaining about not having a debate. They didn't actually want a debate, they wanted to not get backlash. They would never have changed their minds, so there's no grounds for a debate.

Right, well I wanted a debate. And Apple changed their minds. So how is it reminding you of that? Neither of those things apply here.

Forgot about the concept of bugs, have we? How about making Apple vulnerable to demands from every government where they do business? No thanks. I'll take a hammer to any device in my vicinity that implements police scanning.

> Forgot about the concept of bugs have we?

No, but I have a hard time imagining a bug that would meaningfully compromise this kind of system. Can you give an example?

> How about making Apple vulnerable to demands from every government where they do business?

They already are. So are Google, Meta, Microsoft, and all the other giants we all use. And all those other companies are already scanning your stuff. Meta made two million reports in 2024Q4 alone.

Imagine harder. Apple has had several high-profile security bugs in the last few years, and their OS is decried here as a buggy mess every release. QA teams went out of fashion. The onus is on you to prove perfection before ruining lives on hardware they paid for. 100x worse on the vulnerability front, as the tech could be bent to any whim.

Importantly, none of what you described is client-side scanning.

Even I consider abiding by rules on others' property fair. There is no place for spyware of any kind on my phone. Saying that it is to "protect the children" and "to catch terrorists" does not make it any more acceptable.

Do you have any phones without spyware?

I believe my retro Nokia phones (S60/S90) do not have any spyware. I believe earlier Nokia models like the S40 or the monochrome ones do not even have the ability to spy on me (but RMS considers triangulation spyware).
I don't believe any products from the duopoly, without even root access, are free from all kinds of vendor rootkits.

> The vast majority of the debate was dominated by how people imagined it worked, which was very different to how it actually worked.

But not very different to how it was actually going to work, as you say:

> If you change parts of it, sure.

Now try to reason your way out of the obvious "parts of it will definitely change" knee-jerk.

I'm not sure I'm understanding you. Apple designed a system. People guessed at what it did. Their guesses were way off the mark. This poisoned all rational discussion on the topic. If you imagine a system that works differently to Apple's system, you can complain about that imaginary system all you want, but it won't be meaningful, it's just noise.

You understand it just fine, you're just trying to pass you fantasy pod immutable safe future as rational while painting the obvious objections based on the real world as meaningless noise.

Your point did not come across. It still isn't. I don't know what you mean by "pass you fantasy pod immutable safe future as rational". You aren't making sense to me. I absolutely do not "understand it just fine".

If they are running safe mandatory scans on your phones for this, you seem shocked and angry that anyone would imply that this would lead to safe mandatory scans on your phones for that and the other, and open the door for unsafe mandatory scans for whatever. If you can't acknowledge this, it puts you in a position where you can't be convincing to people who need you to deflect obvious, well-known criticisms before beginning a discussion. It gives you crazy person or salesman vibes. These are arguments that someone with a serious interest in the technology would be aware of already and should be included as a prerequisite to being taken seriously. Doing this shows that you value other people's time and effort.
> you seem shocked and angry that anyone would imply that this would lead to safe mandatory scans on your phones for that and the other

Where have I given you that impression? The thing that annoys me is the sensible discussion being drowned out by ignorance.

> If you can't acknowledge this, it puts you in a position where you can't be convincing to people who need you to deflect obvious, well-known criticisms before beginning a discussion.

I cannot parse this, it's word salad. People who need me to deflect criticisms? What? I genuinely do not understand what you are trying to say here. Maybe just break the sentences up into smaller ones? It feels like you're trying to say too many things in too few sentences. What people? Why do they need me to deflect criticisms?
exec 5<>/dev/tcp/www.google.com/80
echo -e "GET / HTTP/1.1\r\nhost: www.google.com\r\nConnection: close\r\n\r\n" >&5
cat <&5
It is an abstraction in GNU Bash; the path doesn't actually exist in the filesystem. Try:

ls /dev/tcp
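The /dev/tcp snippet above leans on a bash-only abstraction, but the underlying idea — a TCP connection is just a file descriptor you can read and write — holds at the OS level. A small Python sketch of the same idea (a local socketpair stands in for the network, so nothing is actually contacted):

```python
import socket

# A connected socket is an ordinary file descriptor: wrap each end as a
# plain file object, mirroring bash's `exec 5<>/dev/tcp/...` redirection.
a, b = socket.socketpair()
wf = a.makefile("wb")
rf = b.makefile("rb")

wf.write(b"GET / HTTP/1.1\r\n")  # write a request line into one end...
wf.flush()
line = rf.readline()             # ...and read it back from the other
print(line)                      # b'GET / HTTP/1.1\r\n'
```

Swap the socketpair for socket.create_connection(("example.com", 80)) and the same makefile() trick speaks to a real server.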
What is line 22?

Error Line 22, Column 4: end tag for "b" omitted, but OMITTAG NO was specified

It's up to you to go hunting back through the document, to find the un-closed 'b' tag.
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"
"http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
<?xml-stylesheet href="http://www.w3.org/StyleSheets/TR/W3C-WD.css" type="text/css"?>
<html xmlns="http://www.w3.org/1999/xhtml" lang="en" xml:lang="en">
<head>
<title>test XHTML 1.0 Strict document</title>
<link rev="made" href="mailto:gerald@w3.org" />
</head>
<body>
<p>
This is a test XHTML 1.0 Strict document.
</p>
<p>
See: <a href="./">W3C Markup Validation Service: Tests</a>
<b>huh
Well, isn't that good
</p>
<hr />
<address>
<a href="https://validator.w3.org/check?uri=referer">valid HTML</a><br />
<a href="../../feedback.html">Gerald Oskoboiny</a>
</address>
</body>
</html>
Or:

<!doctype html>
<title>…</title>
<p>Important: Do <strongNOT</strong> come into the office tomorrow!
<!doctype html>
<title>…<title>
<p>Important: Do <strong>NOT</strong> come into the office tomorrow!
Firefox replaces the document with an error message:

XML Parsing Error: mismatched tag. Expected: </b>.
Location: file:///tmp/x.xhtml
Line Number 22, Column 3:
</p>
--^

Chromium displays this banner on top of the document up to the error:

This page contains the following errors:
error on line 22 at column 5: Opening and ending tag mismatch: b line 19 and p
Below is a rendering of the page up to the first error.
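The draconian-vs-forgiving contrast above can be demonstrated outside a browser with Python's standard-library parsers (a sketch of the two parsing models, not of any specific browser engine; the sample markup is made up):

```python
import xml.etree.ElementTree as ET
from html.parser import HTMLParser

broken = "<p>Do <b>NOT</p> come in tomorrow!"

# XML parsing is draconian: one mismatched tag and nothing is produced.
try:
    ET.fromstring(broken)
    xml_parsed = True
except ET.ParseError:
    xml_parsed = False

# HTML parsing is forgiving: the parser recovers and keeps the text.
class TextCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.chunks = []
    def handle_data(self, data):
        self.chunks.append(data)

collector = TextCollector()
collector.feed(broken)

print(xml_parsed)                 # False: the XML parser gave up at </p>
print("".join(collector.chunks))  # the HTML parser still recovered all the text
```

Whether recovery is a feature depends on the error: here it preserves the message, but as the <strongNOT example above shows, a slightly different typo can silently drop a word instead.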
mount -t tmpfs ram /mnt/ramdisk