Sense: base64 in C++

samessence:

Implementation of Base64 in C++

base64.h:

#ifndef BASE64_H
#define BASE64_H

#include <string>
#include <vector>

typedef unsigned char BYTE;

// Encode bufLen raw bytes from buf into a Base64 string.
std::string base64_encode(BYTE const* buf, unsigned int bufLen);

// Decode a Base64 string back into the raw bytes it represents.
std::vector<BYTE> base64_decode(std::string const& encoded);

#endif // BASE64_H

base64.cpp:
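
(The body of base64.cpp didn't survive in this copy of the post, so here is a minimal sketch that matches the declarations in base64.h, using the standard Base64 alphabet. It is a reconstruction under those assumptions, not necessarily the original code.)

// Sketch of base64.cpp: encodes 3 input bytes into 4 six-bit values and back.
#include "base64.h"

#include <cctype>

static const std::string b64_chars =
    "ABCDEFGHIJKLMNOPQRSTUVWXYZ"
    "abcdefghijklmnopqrstuvwxyz"
    "0123456789+/";

static bool is_base64(BYTE c) {
    return std::isalnum(c) || c == '+' || c == '/';
}

std::string base64_encode(BYTE const* buf, unsigned int bufLen) {
    std::string ret;
    int i = 0;
    BYTE in3[3];   // three input bytes...
    BYTE out4[4];  // ...become four 6-bit alphabet indices

    while (bufLen--) {
        in3[i++] = *(buf++);
        if (i == 3) {
            out4[0] = (in3[0] & 0xfc) >> 2;
            out4[1] = ((in3[0] & 0x03) << 4) | ((in3[1] & 0xf0) >> 4);
            out4[2] = ((in3[1] & 0x0f) << 2) | ((in3[2] & 0xc0) >> 6);
            out4[3] = in3[2] & 0x3f;
            for (i = 0; i < 4; i++)
                ret += b64_chars[out4[i]];
            i = 0;
        }
    }

    if (i) {  // 1 or 2 bytes left over: zero-fill and pad the output with '='
        for (int j = i; j < 3; j++)
            in3[j] = 0;
        out4[0] = (in3[0] & 0xfc) >> 2;
        out4[1] = ((in3[0] & 0x03) << 4) | ((in3[1] & 0xf0) >> 4);
        out4[2] = ((in3[1] & 0x0f) << 2) | ((in3[2] & 0xc0) >> 6);
        for (int j = 0; j < i + 1; j++)
            ret += b64_chars[out4[j]];
        while (i++ < 3)
            ret += '=';
    }
    return ret;
}

std::vector<BYTE> base64_decode(std::string const& encoded) {
    std::vector<BYTE> ret;
    size_t in_len = encoded.size();
    size_t pos = 0;
    int i = 0;
    BYTE in4[4];  // four 6-bit values become three output bytes

    // Stop at the first '=' (padding) or any non-alphabet character.
    while (in_len-- && encoded[pos] != '=' && is_base64(encoded[pos])) {
        in4[i++] = static_cast<BYTE>(b64_chars.find(encoded[pos++]));
        if (i == 4) {
            ret.push_back((in4[0] << 2) | ((in4[1] & 0x30) >> 4));
            ret.push_back(((in4[1] & 0x0f) << 4) | ((in4[2] & 0x3c) >> 2));
            ret.push_back(((in4[2] & 0x03) << 6) | in4[3]);
            i = 0;
        }
    }

    if (i) {  // 2 or 3 values left over: emit the remaining 1 or 2 bytes
        for (int j = i; j < 4; j++)
            in4[j] = 0;
        if (i >= 2) ret.push_back((in4[0] << 2) | ((in4[1] & 0x30) >> 4));
        if (i >= 3) ret.push_back(((in4[1] & 0x0f) << 4) | ((in4[2] & 0x3c) >> 2));
    }
    return ret;
}

With that in place, round-tripping is just base64_decode(base64_encode(buf, len)), which gives back the original bytes.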


Linus Torvalds makes procrastination an art form

So here’s the random trick of the day: say you decided to finally upgrade your monitor due to a random discussion on G+, but it turns out that you haven’t upgraded your desktop in a while, so you’re stuck with single-link DVI.

And the fancy new monitor is a 2560x1440 one that requires dual-link DVI to drive it, so says the documentation in big letters. What do?

Of course, you could just try to find an HDMI cable, since I suspect the machine is still new enough that it happily does HDMI at pixel frequencies high enough that it would all work fine. But you’re a lazy git, and you can’t find a cable anywhere. And by “anywhere” I mean “lying right there on my desk, not under a pile of paper”.

So rather than waste your time with trying to find hardware you may or may not have, just say “hey, I’m not playing games anyway, so why not just drive that thing with a single DVI link at 30Hz instead of the 60Hz it wants. It’s going to buffer the data somewhere to see if it needs to stretch it anyway”. 

And if you are that kind of lazy git, here’s what you do:

Step 1: calculate the VESA timing modes for 2560x1440 at 30Hz. You could do this by hand if you were a real man, but we already covered the whole “lazy git” part. So use the “gtf” tool (no, that’s not random noise, it means “Generalized Timing Formula”; it’s part of the VESA standard for how the pixel signal timings are supposed to look).

Running “gtf 2560 1440 30” spits out the following lovely turd, bringing back bad memories of X11 config files. There’s a reason we don’t do them any more, but people still remember it, and get occasional flashbacks and PTSD:

# 2560x1440 @ 30.00 Hz (GTF) hsync: 43.95 kHz; pclk: 146.27 MHz
Modeline "2560x1440_30.00"  146.27  2560 2680 2944 3328 1440 1441 1444 1465  -HSync +Vsync

Yeah, G+ will completely corrupt the formatting of those two lines, but for once it doesn’t really matter. It looks like noise regardless of formatting. It’s not meant for human consumption.
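If you’d rather not take those numbers on faith, the arithmetic behind them is simple: the pixel clock is the total frame size (including blanking) times the refresh rate, and the hsync rate is the pixel clock divided by the horizontal total. A quick check (not part of the original post, just C++ to verify the gtf output above):

#include <cstdio>

// Sanity-check the gtf modeline above: 3328 and 1465 are the horizontal
// and vertical totals (the last timing value on each axis).
int main() {
    const double htotal  = 3328;
    const double vtotal  = 1465;
    const double refresh = 30.0;                     // Hz

    const double pclk  = htotal * vtotal * refresh;  // pixel clock in Hz
    const double hsync = pclk / htotal;              // hsync rate in Hz

    std::printf("pclk:  %.2f MHz\n", pclk  / 1e6);   // ~146.27 MHz
    std::printf("hsync: %.2f kHz\n", hsync / 1e3);   // ~43.95 kHz
    return 0;
}

Both come out to 146.27 MHz and 43.95 kHz, matching the modeline’s first line exactly.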

Step 2: tell ‘xrandr’ about this mode by just copying-and-pasting the numbers that gtf spit out after incanting the magic words “xrandr --newmode 2560x1440”. So the command line looks something like

xrandr --newmode 2560x1440 146.27 2560 2680 ...

which will quietly seem to do absolutely nothing, but will have told xrandr that there’s a new mode with those particular timings available.

Step 3: tie that mode to the list of modes that the HDMI1 output (which is what is connected to the DVI output, which you would have figured out by just running “xrandr” without any arguments whatsoever) knows about:

xrandr --addmode HDMI1 2560x1440

Again, absolutely nothing appears to happen, but under the hood this has prepared us to say “yes, I really mean that”. Lovely.

Step 4: actually switch to it. This is where the monitor either goes black, spectacularly blows up, or starts showing all its pixels the way it is supposed to:

xrandr --output HDMI1 --mode 2560x1440

Ta-daa! Wasn’t that easy? Never mind what the manual says about how you should use this monitor, we have the technology to do better than that. Or, in this case, worse than that, but whatever.

Now, obviously any sane person would ask himself why the GTF calculations aren’t something that ‘xrandr’ just knows about, and why this isn’t just a single command to say “please switch that output to 2560x1440@30”. Why all the extra steps?

The answer to that question? I have absolutely no idea. Graphics driver people are an odd bunch.

(Source: veterancommander)