Tactical Capacitor

USB 3.1 and Its Anti-Intuition Gatekeeping (+ Others Like DisplayPort and DVI)

Updated: Feb 11, 2021




When it comes to the Universal Serial Bus standard, there seems to have been a massive cultural shift in leadership at the USB-IF (the USB Implementers Forum, the group that creates the standards) starting with version 3.1. Every version since 3.0 sounds like it was designed by the self-important, "if you don't know how to do it, don't bother, you'll just break it" or "god... just give me the keyboard, I'll fix it" (rather than offering to ASSIST or explain anything) breed of gatekeeping IT nerds who find happiness in being condescending to the very people they're supposed to help. We all work with them in IT, they dominate the industry, and you may even be one yourself (but I hope not). It's a standard that makes no sense: any asshole IT guy can go "ahhhh, seriously? It's so simple, go Google it, you idiot," but any normal human being goes "what the hell is it named like this for?"


USB 1.0 (12Mbps), 2.0 (480Mbps), and 3.0 (5Gbps) were the versions that made USB such a mainstream, revolutionary standard: easy to understand, easily marketed, a consolidated interface that provided a universal physical and protocol standard and replaced the highly unintuitive serial/parallel/DB port types.


Then some Sheldon-from-Big-Bang-Theory nerd must have been promoted just to talk down to the masses and make himself feel great. USB 3.1 (10Gbps)... almost made sense. Same electrical compatibility as 3.0, but double the bandwidth. Cool. It could easily have been USB 4.0, but fine, I get it. But THEN they add 'Gens' to it: 3.1 Gen1 = the old 3.0 (5Gbps), and 3.1 Gen2 = the actual 3.1 (10Gbps)?? I'm sorry, the average user has a hard enough time knowing what USB is to begin with, let alone 2 vs. 3. NOW you create some insane, esoteric naming scheme that even savvy IT guys have to consult a decoder wiki to understand? But of course, the creators know what it all stands for, so us plebes are just stupid for not naturally knowing. This is the personality and mindset of a Sheldon IT guy.


There's some absurd, unintuitive technical answer for this, namely that it's not technically 4.0 since it uses the same electrical pinout, or something along those lines... but USB 1 and 2 didn't abide by that ridiculousness. This is nothing short of gatekeeping, of the "I'm smarter than you, this is so easy, why don't you get it?" IT mentality. Never mind making any sense, never mind how human beings think, just "I know it (now), why don't you, ya dummy?"


But then the Sheldons must have let the power go to their heads at the USB-IF. USB 3.2 comes out, with off-road names that complicate the mess into something more obscure, less adoptable, less marketable, and less understandable. We get the following lineup (cheat sheet below):


USB 3.2 Gen 1x1 = 5Gbps (the old USB 3.0 / USB 3.1 Gen1)

USB 3.2 Gen 2x1 = 10Gbps (the old USB 3.1 Gen2)

USB 3.2 Gen 1x2 = 10Gbps over two 5Gbps lanes (USB-C only)

USB 3.2 Gen 2x2 = 20Gbps over two 10Gbps lanes (USB-C only)
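
If your eyes glazed over, that's the point. Here's the whole renaming mess as a lookup table, a minimal Python sketch of the decoder ring nobody should ever need (the table and function are my own illustration, not anything official; the speeds and aliases are the nominal rates from the history above):

```python
# A cheat-sheet decoder for the USB renaming mess above. Speeds are the
# nominal signaling rates; the 3.x "aliases" all refer to the same
# underlying link, just rebranded with each spec revision.
USB_DECODER = {
    "USB 1.0":         ("12 Mbps",  []),
    "USB 2.0":         ("480 Mbps", []),
    "USB 3.0":         ("5 Gbps",   ["USB 3.1 Gen1", "USB 3.2 Gen 1x1"]),
    "USB 3.1 Gen2":    ("10 Gbps",  ["USB 3.2 Gen 2x1"]),
    "USB 3.2 Gen 1x2": ("10 Gbps",  []),  # two 5 Gbps lanes, USB-C only
    "USB 3.2 Gen 2x2": ("20 Gbps",  []),  # two 10 Gbps lanes, USB-C only
    "USB4":            ("40 Gbps",  ["USB 4.0"]),
}

def decode(name: str) -> str:
    """Translate a USB marketing name into the speed it actually means."""
    for canonical, (speed, aliases) in USB_DECODER.items():
        if name == canonical or name in aliases:
            return f"{name} = {speed} (same link as {canonical})"
    return f"{name}: not in the cheat sheet"

print(decode("USB 3.2 Gen 1x1"))  # -> 5 Gbps, i.e. plain old USB 3.0
print(decode("USB 3.2 Gen 2x2"))  # -> 20 Gbps
```

The fact that this table is trivial to write and impossible to remember is the whole complaint: four names for two speeds is a naming scheme, not a standard.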


This is how super nerds develop standards: they say "duhh, of course they make sense" to each other, so they can all act as arrogant gatekeepers. The standards are ridiculous, with no common sense at all. BUT IT SEEMS SHELDON HAS BEEN FIRED, because USB 4.0 goes back to the roots! USB 3.1 and 3.2 were only around for about six years in total. So many standards, so little time to bother with them.


So USB 4.0 is a USB-C standard, and it does 40Gbps! Finally, common sense is back! So much common sense that Apple itself is spearheading adoption by explicitly implementing USB 4.0 over USB-C ports on its new ARM-silicon MacBooks for 2020. This is where we should be. Sheldons are a plague on the IT industry. I've had to work with so many of these assholes at all levels that I am thrilled we've finally turned a page and gotten rid of them at this important level of technology.


But, Sheldons persist in other forms...


The DisplayPort standard seems to be another area that the super nerds who hate normal human beings dominate. The Wikipedia article on DisplayPort explains it all: 1.0, 1.1, 1.2, 1.4, 1.4a, and 2.0 are all just as arbitrary and absurd as each other. 2.0 has no right to become its own major version if 1.2 and 1.4 didn't. And the physical standards, my god: how many peripherals and laptops do you own with differing mixes of active DP, passive DP, or Mini DP (in active or passive), combined with version incompatibilities that prevent you from using one of your adapters, cables, or peripherals to successfully make a connection?
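
For context on how the DP version soup maps to actual bandwidth, here's a rough sketch using the commonly quoted four-lane maximums (the raw link rates are the well-known figures; the helper function and its name are my own illustration, not anything from VESA):

```python
# Rough DisplayPort version -> bandwidth table (raw link rate, 4 lanes).
# DP 1.0-1.4a use 8b/10b encoding (80% efficiency); DP 2.0 moves to
# 128b/132b (~97%). Figures are the commonly quoted maximums.
DP_RAW_GBPS = {
    "1.0": 10.8, "1.1": 10.8,                # RBR/HBR
    "1.2": 21.6,                              # HBR2 (the adaptive-sync baseline)
    "1.3": 32.4, "1.4": 32.4, "1.4a": 32.4,  # HBR3
    "2.0": 80.0,                              # UHBR20
}

def effective_gbps(version: str) -> float:
    """Usable payload bandwidth after line-encoding overhead."""
    efficiency = 128 / 132 if version.startswith("2") else 0.8
    return DP_RAW_GBPS[version] * efficiency

for v in ("1.1", "1.2", "1.4", "2.0"):
    print(f"DP {v}: {DP_RAW_GBPS[v]:.1f} Gbps raw, {effective_gbps(v):.2f} usable")
```

Notice that 1.2 doubled the bandwidth of 1.1, and 2.0 more than doubled 1.4, yet one got a minor version bump and the other a major one. That's the arbitrariness in numbers.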


You want G-Sync (or FreeSync)? Ooooh, you'd better have a display that can run DP 1.2, set your monitor to DP 1.2 mode, get the right cable, and activate the function in the Nvidia or AMD video control panel. Make sure you learn the differences between active and passive cables and their nuanced applications, too. You want to set up multiple monitors on your Microsoft Surface, or another interface-limited device? GOOOOOOD LUCK figuring out the right combination of adapters, cables, and monitors compatible with daisy-chaining their signals to run multiple displays over a single source connection. Oh, need to run a DP cable through a tight-fitting desk cutout? Uh oh, look out for the incredibly thick cable variants and the giant boots with locking fingers that overlap the other usable ports on your GPU or monitor. DP is the bane of an IT guy's existence.
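
The frustrating part is that the underlying daisy-chain math is trivial; it's the naming and cabling that bury it. A back-of-envelope sketch (the monitor chain is a made-up example, and the pixel math ignores blanking intervals, so real requirements run somewhat higher):

```python
# Back-of-envelope MST daisy-chain check: do these monitors fit in a
# single DP 1.2 link? Pixel rates here ignore blanking intervals, so
# treat the result as a rough floor, not a guarantee.
DP_1_2_USABLE_GBPS = 17.28  # 21.6 Gbps raw * 0.8 (8b/10b encoding)

def stream_gbps(width: int, height: int, hz: int, bpp: int = 24) -> float:
    """Approximate bandwidth of one uncompressed video stream."""
    return width * height * hz * bpp / 1e9

chain = [(2560, 1440, 60), (1920, 1080, 60)]  # hypothetical monitor chain
total = sum(stream_gbps(*m) for m in chain)
print(f"Chain needs ~{total:.1f} Gbps of {DP_1_2_USABLE_GBPS} Gbps available")
print("Fits" if total <= DP_1_2_USABLE_GBPS else "Does not fit")
```

That's it. A 1440p and a 1080p panel at 60Hz fit comfortably in one DP 1.2 link, but good luck learning that from any label on the cable.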

[Image] Top to bottom: DVI-I Single Link, DVI-I Dual Link, DVI-D Single Link, DVI-D Dual Link, DVI-A


These Sheldons seem to have all migrated over from the DVI standard, though, which makes sense. DVI puts a similar-looking connector, color scheme, and pinout on top of a vastly different compatibility table. If your GPU or monitor has DVI, you can't just rely on a white cable that looks like DVI the way a VGA cable unmistakably looks like VGA; you'll have to consult a cheat sheet, or be the kind of super nerd who feels this is important knowledge to keep in their brain and sees nothing wrong here. DVI-D (digital), DVI-I (integrated, i.e. analog and digital in one standard), DVI-A (analog), single and dual link (which determine total bandwidth)... and then even Mini and Micro DVI variants on top of that.
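
The whole DVI cheat sheet fits in a dozen lines of code, which is exactly the problem: the connector itself should have made this obvious. A minimal sketch of the compatibility rule (just the "do both ends share a signal type" check; real-world adapters add their own wrinkles):

```python
# The DVI cheat sheet, as code: which signals each connector variant
# carries, and whether a given pair of ends can actually talk.
DVI_VARIANTS = {
    "DVI-D Single Link": {"digital"},
    "DVI-D Dual Link":   {"digital"},            # extra TMDS pairs, ~2x bandwidth
    "DVI-I Single Link": {"digital", "analog"},
    "DVI-I Dual Link":   {"digital", "analog"},
    "DVI-A":             {"analog"},
}

def compatible(end_a: str, end_b: str) -> bool:
    """True if the two ends share at least one signal type."""
    return bool(DVI_VARIANTS[end_a] & DVI_VARIANTS[end_b])

print(compatible("DVI-D Single Link", "DVI-A"))  # False: no common signal
print(compatible("DVI-I Dual Link", "DVI-A"))    # True: analog pins present
```

Five connectors, two signal types, one bandwidth split, all hidden behind near-identical white plugs. That's the DVI experience in a nutshell.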


As a normal human, even a gigantically nerdy one, if you think for yourself, have empathy for normal humans, and have a knack for intuition (Apple is a double-edged, but good, example of that), you'll probably agree with me on these standards. It's not a fair or common-sense way to approach anything in the world. If you were, say, a mechanic, and insisted on making standards that relied on unintelligible, esoteric, proprietary naming schemes, you wouldn't last long. Imagine someone calling a small-block V8 engine an 'offset tubular low-volume combustible eight-cylinder block' and a slightly bigger block a 'large-capacity combustible multi-cylinder unit'. Neither sounds relatable, you can't compare them, and they make little to no sense; they sound similar, but in a random way. It's awful. And it's what sucks about the IT industry being dominated by people who build themselves up as smarter than others and use these tactics to prove it, even at high levels like the ones mentioned above.


Anyway, the common-sense guys still exist, and it seems they exist in similar numbers, because we have standards like HDMI and USB 4.0 out there too. If you agree or disagree, please leave a comment or send a message to this site's chat so I can understand the landscape a little better. Thanks for reading this far!

