Up until now, most posts on this blog have been about hacking, and some about being an IT professional. Both boil down to knowledge, and knowledge is gained through trial and error; truthfully, far more error than success goes into any triumphant outcome. Since I believe stories of personal failures and successes benefit the community as a whole, I figured I might as well share some.
Friday, and the day is starting off fine: checking a few emails on the local Exchange server. After planning a long-awaited hosted Exchange migration, the goal of the day was simple: attach each stale mailbox to my Outlook, archive its mail, then delete the mailbox. The hosted Exchange provider was going to charge by the mailbox, so those outdated old mailboxes of employees from yesterday have no place in our new system. One mailbox is attached, an archive is running, when suddenly... Outlook says "Disconnected".
Crap, what the hell? Look down and... this computer is connected to the network. Open up cmd, type "net view" to see if I really am online, and... I am. Which means... something worse is happening. Keys to the server closet were yanked from my desk and I ran down the hall, well, briskly walked really, threw the door open, and the screen on the Exchange server is black. Not blue, black. Nothing, nada, zilch, nothing on that screen at all. This is our email server, lifeblood to some of the employees here; without it they are cut off from clients, which means no money, which means not good. Time for the first trial.
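For the curious, the triage here is just two questions: can I see the network at all, and can I see the Exchange box specifically? "net view" answers the first by enumerating visible SMB hosts; the second you can answer by probing the server directly. A minimal sketch of that second check in Python, where the hostname is a hypothetical stand-in for your own Exchange box:

```python
import socket

# Hypothetical hostname for the on-prem Exchange server; swap in your own.
EXCHANGE_HOST = "exch01.corp.local"

def port_open(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    # SMTP (25) and HTTPS/OWA (443) are the usual signs of life on Exchange.
    for port in (25, 443):
        state = "open" if port_open(EXCHANGE_HOST, port) else "closed/unreachable"
        print(f"{EXCHANGE_HOST}:{port} is {state}")
```

If your own box answers "net view" but the Exchange host refuses both ports, the problem is the server, not your cable, which is exactly the bad news here.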
Hold down the power button, watch every light go off, click the power button once, and wait. A small blinking red light taunts me from the front, saying "you know I'm broken" in a subtle, angry fashion. I know what blinking red means, and I know you can sometimes circumvent it by resetting the motherboard. Unplug the redundant PSUs, hold the power button to drain the system of power, take off the side panel, remove the motherboard battery, everything empty. The system is fresh, with no memory of its prior self. Press that power button once more: the back fan flinches, the lights on the motherboard come on, and that red light blinks again. Denial sinks in. The ritual of unplugging power and removing the CMOS battery gets rinsed and repeated five to ten times before defeat is admitted and the towel is thrown in.
The time is 8:00am, we've been open since 7:30am, and it's Friday... the best possible day and time for a catastrophic meltdown right before a hosted Exchange migration. Luckily for us, our MSP has backups of our servers; the machine should be up momentarily in virtual form. I didn't realize we had a hypervisor in the office, but that's fantastic. The MSP says they'll work as fast as they can. An hour goes by... not a word... two hours go by... and the phone rings. The images in question? Full and incremental backup chains being compiled by software; it's going to take at least four to six more hours before anything is back up and running.
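Why so long? Restoring from an incremental chain means starting from the last full image and replaying every incremental on top of it, in order, before there is anything bootable. A toy sketch of the idea, where the backups are just made-up dicts of block-id to data for illustration:

```python
# Sketch of how an incremental backup chain is "compiled" into a restorable
# image: start from the full backup, then apply each incremental in order.

full_backup = {0: "boot", 1: "system-v1", 2: "mail-v1"}

# Each nightly incremental records only the blocks that changed that day.
incrementals = [
    {1: "system-v2"},                # Monday night
    {2: "mail-v2", 3: "logs-v1"},    # Tuesday night
    {2: "mail-v3"},                  # Thursday night
]

def compile_chain(full: dict, increments: list[dict]) -> dict:
    """Replay every incremental over the full image to get the final state."""
    image = dict(full)
    for delta in increments:
        image.update(delta)  # later blocks overwrite earlier ones
    return image

print(compile_chain(full_backup, incrementals))
# {0: 'boot', 1: 'system-v2', 2: 'mail-v3', 3: 'logs-v1'}
```

At real image sizes those "blocks" add up to hundreds of gigabytes, which is where a four-to-six-hour estimate comes from.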
Over the past two hours I have been bugged, pestered, poked, and prodded by everyone in the building asking when everything was going to be fixed. They needed their email, they needed to contact clients, they needed to be contacted by clients... and now I must deliver a killing blow as if I were Kano from Mortal Kombat, ripping their still-beating hearts from their chests and telling them they must wait even longer... maybe even until Monday.
The utter feeling of defeat looms over the remainder of the day, patiently waiting for the phone to ring with the MSP on the other end telling me we are good to go. At 4:22pm the phone rings: they can't promise it'll be back up by 4:30, which is when I leave, but it'll be back up soon. I voluntarily surrender my personal cellphone number in case of any emergency. I have faith in these guys; they'll have it back up and running soon.
The clock hits 4:30... wait an additional five minutes just in case... no rings. I'll be alerted when everything is working, so might as well get on my way. Oh, the traffic... bumper to bumper to bumper. Lord, was it an amazing end to such a beautiful day... and the phone rings. Thank $DEITY, everything will be fine, or at least in my mind it will be. "We need a CD key for the server, the one it grabbed isn't working." It's 5:00, I've sat in 30 minutes of traffic... only to be right next to the last possible exit where I could turn around... I have no choice, it has to be done. Quick turn-around and headed back to work; at least this time I know which route not to take heading home. After some more bumper to bumper, three turns before I'm back at work... the phone rings again: "oh, turns out we called it in to Microsoft and it's fine." I'm not happy... There is a bottle of wine calling my name back home.
I make one more phone call on the way home for an updated quote from the new email host. I'm tired of dealing with this failing hardware; this is getting done Monday morning.
Saturday morning, the wife heads to work, and I roll over to check any G+ updates, any emails, anything of note, and see I have a voicemail. Hmm, it's 9:00 in the morning on a Saturday, who called me?... The MSP... "the host server went down at 3:00 this morning." The host server? I'm confused before they even finish their voicemail. What do you mean my host server went down? The hypervisor failed? How? ESXi went down? I don't follow. I hung up the voicemail for a second to gather my thoughts, since I stopped paying attention the moment they said the host server failed. Redialed the voicemail to really listen to what exactly is going on. "All you have to do is log in to the server and open VirtualBox".... I have to open VirtualBox? The verbiage made it sound like I didn't know what he was talking about, like this was a new lesson. It's not; I am very familiar with VirtualBox. BUT. Why is my Exchange server living inside a type 2 hypervisor? Its stability is limited by the stability of the host system, which... isn't proving to be all that great.
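For context: a type 2 hypervisor like VirtualBox runs as an ordinary application on top of a host OS, so the guest inherits every crash and reboot of that host. If you're stuck with that setup, the VM should at least come back without someone logging in and clicking. A hedged sketch of that remote kick, driving VirtualBox's real VBoxManage CLI from Python, where the VM name is a hypothetical stand-in:

```python
import subprocess

# Hypothetical VM name as registered in VirtualBox on the host.
VM_NAME = "EXCHANGE-VM"

def vm_running(name: str) -> bool:
    """Check whether the VM appears in VirtualBox's list of running VMs."""
    out = subprocess.run(
        ["VBoxManage", "list", "runningvms"],
        capture_output=True, text=True, check=True,
    ).stdout
    # Output lines look like: "EXCHANGE-VM" {uuid}
    return f'"{name}"' in out

def start_headless(name: str) -> None:
    """Start the VM without a GUI console, so no interactive login is needed."""
    subprocess.run(["VBoxManage", "startvm", name, "--type", "headless"], check=True)

if __name__ == "__main__":
    if not vm_running(VM_NAME):
        start_headless(VM_NAME)
```

Run as a scheduled task at host boot, that removes the "log in and open VirtualBox" step. It does nothing about the host itself blue-screening, of course.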
I did what had to be done and drove in Saturday morning in my finest apparel. Booted up the server, started the VM, enabled all the Exchange services, and made sure they were up and running. And, of course, set out to figure out what caused the crash. Good ole Event Viewer, my dearest friend, let's see what you have in store for us today. I expected nothing crazy: a disk drive failure, a memory blip, something simple that could be ignored. A one-time failure we'd never see again, something so generic that wasting time trying to solve it would prove nothing. That was the hope, but not the reality. A piece of software proprietary to our MSP crashed shortly after loading, causing a BSOD. This is the server they run and they maintain; it's not mine to mess around with all willy-nilly, so I did the appropriate thing and called, left a voicemail... but never heard back.
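If you'd rather do that post-mortem from a command line than click through Event Viewer, Windows ships wevtutil for querying event logs. A minimal sketch, again driven from Python, that pulls the most recent critical and error entries from the System log:

```python
import subprocess

# Pull the 20 most recent Critical (level 1) and Error (level 2) events
# from the System log, newest first, as plain text.
result = subprocess.run(
    [
        "wevtutil", "qe", "System",
        "/q:*[System[(Level=1 or Level=2)]]",  # XPath filter on severity
        "/c:20",        # at most 20 events
        "/rd:true",     # reverse direction: newest first
        "/f:text",      # human-readable output instead of XML
    ],
    capture_output=True, text=True, check=True,
)
print(result.stdout)
```

After an unexpected reboot, Event ID 41 (Kernel-Power) marks the dirty shutdown and Event ID 1001 (BugCheck) carries the stop code, which is what points the finger at whichever driver or agent took the box down.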
Sunday, attempted to log into email at 9:00am again... failure... and had to replay yesterday's games. Same cause of failure, same situation, still no response from the MSP. Monday morning will be fun.
Monday, the server is completely BSOD. Not just a blip and a reboot, a full-fledged BSOD: staring, taunting, agonizing. The Exchange migration I wanted to ease into is now going to be gunned like a drag car. I have no choice; I have to solve this quickly and give the same announcement that the ever-so-precious email is still down. The stress, the pressure, the anger, all growing from the complete failures of multiple systems. Rebooted the server three times, each time resulting in a BSOD after three minutes of uptime. Finally safe mode was engaged, msconfig was run, and anything pertaining to the MSP was disabled... one problem has seemingly been solved.
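The same disabling can be scripted instead of clicked through msconfig: Windows' built-in sc tool can flip a service's startup type. A sketch under the assumption that the offending agent runs as named services; the service names here are invented for illustration:

```python
import subprocess

# Hypothetical service names belonging to the MSP's monitoring agent.
MSP_SERVICES = ["MSPAgent", "MSPMonitor"]

for svc in MSP_SERVICES:
    # "sc config <name> start= disabled" sets the startup type; the space
    # after "start=" is required by sc's argument parser.
    subprocess.run(["sc", "config", svc, "start=", "disabled"], check=True)
    # Stop it now as well, rather than waiting for the next reboot.
    subprocess.run(["sc", "stop", svc], check=False)
```

Run from an elevated prompt in safe mode, this accomplishes the same thing as unchecking boxes, with the small bonus of leaving a record of exactly what you turned off.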
The remainder of the week will be dedicated to getting everyone's mailboxes switched over to hosted Exchange.
I am hungry, I am tired, I am caffeinated, I am annoyed, I am stressed, I am worried, I am IT. Until next time, be safe my goblins.