Let's make Friday 'Joke Day'!

What is this?

Clip….clap….clip….clap….clip….clap…

BANG BANG….BANG

CLIP CLAP CLIP CLAP CLIP CLAP!!!



It’s an Amish drive-by shooting
 
View attachment 131956

XKCD’s insight is awesome.
Oof. And they're right, too. I once had a client who had to electronically bill their largest customer via an arcane standard developed in-house by said customer. They had to write custom software to do the billing; they put it on a server, and it happily ran for many years.

Then their CIO realized that something like 40% of the company's revenue depended on said system and decided it was important to modernize it and have backups in place... And they literally couldn't find the server. They had moved all of their infrastructure from their on-site data center to the cloud years before, the former data center had mostly been converted to a training room, it wasn't there, and anybody who'd had physical access to it had long since moved on to greener pastures. So the CIO decided this was pretty much an emergency, as they were one dead hard drive or power supply or motherboard away from a financial near-disaster.

Luckily, they could still connect to it and download the code, and we got hired to pick the code apart and write a new system based on it. Partway into the project, we discovered that the server wasn't even doing everything itself: when it was put in place, it still had to do a couple of things by connecting to an even older server, one from 30-ish years ago that they had sworn up and down at the beginning of the project didn't even exist anymore. Someone probably drywalled over both of them years ago... And we could not connect to the older one ourselves; it was thought that it might even predate the invention of Ethernet. So it was a black box, and we couldn't reproduce what it was doing, which was integral to the process.

So, the project was quietly terminated, and the CIO went and got himself another job so he wouldn't be around when it finally went down. :eek:
 
Paris Olympics pistol shooting competition. Turkey's team won second place, the silver medal. The guy looks cool and relaxed compared to the other competitors.

View attachment 132002

View attachment 131999
The craziest thing about that dude... He didn't get the silver medal on his own. He got 13th in the individual competition.

He got the silver medal in the mixed team event, along with a 24-year-old woman, Sevval Ilayda Tarhan, who shot just like him:

1725656765953.png

She got 7th in the individual competition, he got 13th... So it's kinda surprising we never heard about her at all! :eek:
 
The craziest thing about that dude... He didn't get the silver medal on his own. He got 13th in the individual competition.

He got the silver medal in the mixed team event, along with a 24-year-old woman, Sevval Ilayda Tarhan, who shot just like him:

View attachment 133222

She got 7th in the individual competition, he got 13th... So it's kinda surprising we never heard about her at all! :eek:
That picture is even more cool than the one of just him. Those look like two people I’d like to hang out with. Have some beers, shoot some guns, talk about flying, …
 
This morning I was sitting on the porch drinking coffee in my slippers.
I really need to wash some dishes.
This guy used to drink champagne from his shoe after winning a race. Occasionally he would coerce other people on the podium into drinking from the same shoe.
1725706131279.png
 
That picture is even more cool than the one of just him. Those look like two people I’d like to hang out with. Have some beers, shoot some guns, talk about flying, …
Substitute the word beers with steaks, and that was my night last night!
 
Oof. And they're right, too. I once had a client who had to electronically bill their largest customer via an arcane standard developed in-house by said customer.

And they literally couldn't find the server. They had moved all of their infrastructure from their on-site data center to the cloud years before

Luckily, they could still connect to it and download the code from it,

This is crazy to me. If you can connect, you have a network address; from that you can get the MAC from the ARP table, then look on the network equipment to find the port and trace the cable.
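One small wrinkle in the lookup described above: `arp -a` on most hosts prints colon- or dash-separated MACs, while Cisco-style switches expect dotted four-digit groups for `show mac address-table address`. A minimal helper to do the conversion (the MAC below is made up for illustration):

```python
# Hypothetical helper: convert a MAC as printed by `arp -a` into the
# dotted triplet format that Cisco's `show mac address-table` expects.
def to_cisco_mac(mac: str) -> str:
    # strip colon/dash separators and normalize case
    hexdigits = mac.replace(":", "").replace("-", "").lower()
    if len(hexdigits) != 12:
        raise ValueError(f"not a MAC address: {mac!r}")
    # regroup the 12 hex digits into three dot-separated groups of four
    return ".".join(hexdigits[i:i + 4] for i in range(0, 12, 4))

print(to_cisco_mac("00:1A:2B:3C:4D:5E"))  # -> 001a.2b3c.4d5e
```

From there, `show mac address-table address 001a.2b3c.4d5e` on each switch names the port the MAC was learned on, and that's the cable to go trace.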
 
Then, their CIO realized that something like 40% of the company's revenue was dependent on said system and decided it was important to modernize it and have backups in place... And they literally couldn't find the server. They had moved all of their infrastructure from their on-site data center to the cloud years before and the former data center was mostly converted to a training room, and it wasn't there, and anybody who'd had physical access to it had long moved on to greener pastures.
I worked at one of the big biotech firms some years back and was involved in a project to put in a new inventory system for all of the -80°C and LN2 (-196°C) freezers. The crown jewels of a biotech are its cell lines: thousands and thousands of irreplaceable cell lines. Ours were known to be in 18 freezers in secure locations scattered across one big building.

Except when we started the project, no one could find one of the -80C freezers. We had the old inventory for it. There were people who could remember pulling cell lines from it in the past. But no one recalled exactly where it was, and by the time I left for greener pastures, it still hadn't been located.
 
This is crazy to me. If you can connect, you have a network address; from that you can get the MAC from the ARP table, then look on the network equipment to find the port and trace the cable.
Agreed. It takes some detective work by an actual network guy, but you should be able to find it…
 
This is crazy to me. If you can connect, you have a network address; from that you can get the MAC from the ARP table, then look on the network equipment to find the port and trace the cable.
You would think... I wasn't in on that part of the conversation, but either they no longer had good enough networking people to do it, or (more likely) the cable went into a rat's nest under the floor and/or off into parts unknown in the rest of the building. Either it wasn't labeled, or whatever scheme they used to label things was no longer known to the current IT staff, which had completely turned over more than once since the thing was spun up. Or both.

Is there a way to physically trace a twisted pair Ethernet cable while it's still plugged in and operational?
 
This is crazy to me. If you can connect, you have a network address; from that you can get the MAC from the ARP table, then look on the network equipment to find the port and trace the cable.
:) You're young or lucky enough to have never worked in a large building filled with unmanaged switches. Or hubs!! Looking at a bundle of cable as big around as your arm going into a jagged hole in the drywall, from which you can see that maybe some goes up and some goes down, but nobody knows where. Or an old computer room where there's so much cabling under the floor that some of the raised tiles are pushed up above the grid. That one kinda gives me cold chills; I used to work at a place like that.

Even more fun are the servers where people know where they are, but they don't know if they'll start. That conversation sometimes goes like this: "Oh, we'll just power it down, move it to the new building, and then start it up." Then I ask, "When's the last time that was restarted, and is any of it under support at all?" Meaning: if the BIOS/firmware doesn't remember how the drives work, or the boot tracks of the HDs are gone, it might just not start, and it might not be recoverable. It's been a while, but as recently as 10 years ago I saw servers that hadn't been restarted in 10+ years.

OK, that was an "inside sad joke," not really a joke. I'll try to fix it: what's the difference between a used car salesman and an IT salesman? The used car salesman knows when he's lying to you.
 
:) You're young or lucky enough to have never worked in a large building filled with unmanaged switches. Or hubs!! Looking at a bundle of cable as big around as your arm going into a jagged hole in the drywall, from which you can see that maybe some goes up and some goes down, but nobody knows where. Or an old computer room where there's so much cabling under the floor that some of the raised tiles are pushed up above the grid. That one kinda gives me cold chills; I used to work at a place like that.

Even more fun are the servers where people know where they are, but they don't know if they'll start. That conversation sometimes goes like this: "Oh, we'll just power it down, move it to the new building, and then start it up." Then I ask, "When's the last time that was restarted, and is any of it under support at all?" Meaning: if the BIOS/firmware doesn't remember how the drives work, or the boot tracks of the HDs are gone, it might just not start, and it might not be recoverable. It's been a while, but as recently as 10 years ago I saw servers that hadn't been restarted in 10+ years.

OK, that was an "inside sad joke," not really a joke. I'll try to fix it: what's the difference between a used car salesman and an IT salesman? The used car salesman knows when he's lying to you.
NetWare 3.x servers were notorious for this: they had a startup script that essentially ran console commands. The servers would run for years without requiring a restart (which is impressive in its own way), and folks would run console commands to load new stuff but never add those commands to the startup script. Reboot, and nothing works, and nobody remembers all the things they added from the console. Happened a lot in the '90s.
 
Backups? We don’t need no stinkin’ backups
 
I’m embarrassed to admit that it took me a few minutes
 
NetWare 3.x servers were notorious for this: they had a startup script that essentially ran console commands. The servers would run for years without requiring a restart (which is impressive in its own way), and folks would run console commands to load new stuff but never add those commands to the startup script. Reboot, and nothing works, and nobody remembers all the things they added from the console. Happened a lot in the '90s.
A former co-worker had to deal with a funny problem with NetWare. The main display screen would crash on startup. He eventually tracked the issue down to the fact that the screen showed the number of days since the last restart. The field was only three digits wide, so once uptime exceeded 999 days, the screen would crash.


Tim
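That failure mode is easy to reproduce in miniature. A toy sketch (purely illustrative Python, not actual NetWare code): a days-since-restart field three characters wide that blows up the day uptime hits 1000.

```python
# Toy model of the bug described above -- not actual NetWare code.
def render_uptime(days: int, width: int = 3) -> str:
    """Render the days-since-restart counter into a fixed-width field."""
    text = str(days)
    if len(text) > width:
        # this is where the real server's display screen fell over
        raise OverflowError(f"{days} does not fit in a {width}-char field")
    return text.rjust(width)

print(render_uptime(999))    # day 999 still fits: '999'
try:
    render_uptime(1000)      # day 1000: the field overflows
except OverflowError as err:
    print("display crashes:", err)
```

Nearly three years of flawless uptime, undone by one character.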
 
:) You're young or lucky enough to have never worked in a large building filled with unmanaged switches. Or hubs!! Looking at a bundle of cable as big around as your arm going into a jagged hole in the drywall, from which you can see that maybe some goes up and some goes down, but nobody knows where. Or an old computer room where there's so much cabling under the floor that some of the raised tiles are pushed up above the grid. That one kinda gives me cold chills; I used to work at a place like that.

Even more fun are the servers where people know where they are, but they don't know if they'll start. That conversation sometimes goes like this: "Oh, we'll just power it down, move it to the new building, and then start it up." Then I ask, "When's the last time that was restarted, and is any of it under support at all?" Meaning: if the BIOS/firmware doesn't remember how the drives work, or the boot tracks of the HDs are gone, it might just not start, and it might not be recoverable. It's been a while, but as recently as 10 years ago I saw servers that hadn't been restarted in 10+ years.

OK, that was an "inside sad joke," not really a joke. I'll try to fix it: what's the difference between a used car salesman and an IT salesman? The used car salesman knows when he's lying to you.
I've done both. Worked at a large online brokerage that had grown very rapidly, and everything had to be done NOW with no downtime (and often little or no planning). Masses of cables with zero organization or documentation. The company's public facing DNS server (singular) was a long-obsolete SparcStation 5 plugged into a 10 Mbps hub. Not switch. Servers, all of which were Sun hardware, were "managed" by telnet (no, not ssh) or more often from a couple of long tables filled with monitors and keyboards, attached to 4- and 8-way KVM switches. No two servers were alike, and if one went down we were guaranteed that either a website or some function went with it. Tracing cables was difficult, but we still did it. Oh, and the entire core data center network was ATM...

I did a lot of work there; it was a target-rich environment. When I left most things were running on identical Linux servers in two geographically diverse data centers, and you could theoretically walk through and randomly pull power or Ethernet cables and no customer would ever notice. Nobody set foot in a computer room except the data center guys, for cabling or install/deinstall work.

Then I went to work for a large regional bank, which has long since been absorbed by one of the big 4. We would occasionally find a Sun server running under someone's desk... or a "pizza box" workstation doing something important in production. Old habits die really, really hard.
 