My Ruby Development Environment
by Jim Myhrberg

Setting up your development environment is always a tedious task. My own environment has changed many times over the years, and I've recently gotten my Ruby-related setup to the point where I'm finally really happy with it.

This is by no means a complete, in-depth, step-by-step guide to setting up your environment like mine, though. Instead it's meant as a quick reference to the tools I use, and how I use them. If you were looking for a magical silver bullet, this article is not it. If you're looking for an exciting adventure and treasure hunt, this article is hopefully it.

Ruby

I install and manage multiple Ruby versions with rbenv and ruby-build. They are not as established as RVM, which has been around a lot longer, but I prefer rbenv for its bare-bones simplicity. If you're coming from RVM, the main thing you'll miss is its gemset feature, which won't be an issue if you use Bundler properly. There is, however, a gemset plugin available for rbenv.

To install rbenv, check the README on the project page. I prefer the Git checkout method. ruby-build has installation info on its project page too, but on OS X I prefer installing it with Homebrew.

Once both are installed, you can install your Ruby version of choice, for example:

$ rbenv install 1.9.3-p0

Then set your global Ruby version:

$ rbenv global 1.9.3-p0

I tend to install a very basic set of gems, as all project-specific gems will be managed by Bundler. So obviously Bundler needs to be installed:

$ gem install bundler

With rbenv this does not create the bundle executable however, so the next step is to run:

$ rbenv rehash

This creates the bundle executable in ~/.rbenv/shims, and also any missing executables for other gems you have installed.

Gem Management with Bundler

Bundler is fantastic, but if you just run a plain bundle install, I would argue you're not actually using Bundler correctly, as it installs the gems into your Ruby version's gem path. One of Bundler's great features is that you can keep gems completely self-contained within a project. For that reason I use the --path option to install gems into vendor/bundle relative to the Gemfile.
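
If you're new to Bundler, a Gemfile is just a small Ruby file in the project root listing the gems the project depends on. A minimal one might look something like this (the gems listed are only placeholders):

source "https://rubygems.org"

gem "sinatra"
gem "rake"

group :test do
  gem "rspec"
end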

And because I'm lazy, I have a handy bash alias for my bundle install command.

alias bi="bundle install --path vendor/bundle --binstubs=vendor/bundle/bin"

The --binstubs option there leads me into how I avoid typing bundle exec before every command. It tells Bundler to generate executables for all the installed gems in the vendor/bundle/bin directory within the project. Simply add the following at the very end of your ~/.profile or ~/.bash_profile:

export PATH="./vendor/bundle/bin:$PATH"

This enables you to call all of the project's gem executables like normal, but they're Bundler-aware, as if they'd been called with bundle exec.

I also have a few bash aliases for bundle exec ... which I find useful:

alias ru="bundle exec ruby"
alias ra="bundle exec rake"
alias rs="bundle exec rspec"
alias ca="bundle exec cap"
alias cu="bundle exec cucumber"

Update: Instead of using an alias to set Bundler options, you can set default Bundler config options in ~/.bundle/config. Mine looks like this:

---
BUNDLE_PATH: vendor/bundle
BUNDLE_BIN: vendor/bundle/bin

Run bundle help config for more information.

Running Ruby Apps

For running web-based apps I use Pow and/or Foreman. Pow is my favorite of the two, but for certain projects Foreman is the better match.

I tend to decide on a case-by-case basis. For example, when a project needs a few background workers, I tend to start all of them with Foreman, while running the web-based frontend with Pow; a quick Procfile sketch for that kind of setup follows below.
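
For the Foreman side, that means a Procfile in the project root with one line per process. A hypothetical worker-only example (with Pow handling the web frontend separately) might look like this; the process names and rake tasks are just examples:

worker: bundle exec rake jobs:work
scheduler: bundle exec rake scheduler:run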

MySQL, Redis, and More...

Because I run Mac OS X, I use Homebrew to install things like MySQL, Redis, Memcached, and others. If you're not on OS X, you'll have to find your own preferred way to install these kinds of tools, but I'd imagine your operating system has some form of package management system you can use.


Concept: Decentralized Zero-Conf VPN
by Jim Myhrberg

Imagine a new kind of VPN service which doesn't require any kind of configuration and magically just works without a central server.

Who Am I?

Before I go on, I should point out that I have no background in network infrastructure, P2P network development, or pretty much any of the specific technologies this touches. My background lies in web development, and this idea is based on my understanding of these technologies. As such, please point out any errors, misconceptions, or other issues you might find. But keep in mind, this is simply an idea for a potentially awesome technology.

The Idea

For a while now it's been bugging me how messy it can be to get a VPN up and running, or to get set up by any other means to gain remote access to different computers. Yesterday I had an idea somewhat based on Hamachi and BitTorrent.

Imagine a VPN service which, instead of connecting to a central server, uses BitTorrent's DHT implementation to find peers which are part of the same “network”, just like BitTorrent clients find peers who have the same torrent. Once peers have been found, secure and encrypted connections are set up between the local machine and all remote peers, creating a Virtual Private Network across the Internet which in essence works a lot like Apple's Bonjour technology: everybody talks directly to everybody.

The Details

I believe the concept is rather simple, but implementing it could be another story. From a functionality point of view, these are some of my initial ideas/notes:

  • The networks you are connected to are managed by “key files”, containing the following:
    • A random unique hash string which identifies the network when the client searches for peers via DHT.
    • An encryption key (or a set of encryption keys), which are used to encrypt all traffic between peers.
  • Create a network: You generate a new key file which is automatically populated with a random hash signature and encryption keys, somewhat like how SSH keys are generated — see the rough sketch after this list.
  • Join a network: You simply copy the key file from one computer to another. The second computer will find the first computer via DHT and start communicating with it automatically.
  • PEX (Peer Exchange) can also be used to discover all the peers in your network faster. Meaning once you find one peer via DHT, it will tell you about all the peers it knows about.
  • Once peers are connected to each other, Apple's Bonjour technology (or something similar) could be used to get around potential IP conflicts and the like.
  • In theory, the VPN client could behave just like a BitTorrent client when searching for peers via DHT. Once a peer is found however, it uses its own protocol to set up a secure VPN connection. This could allow us to take advantage of BitTorrent DHT nodes already online, effectively piggybacking on BitTorrent users.
  • Everything should be open source. Cause really, what's the point otherwise? :)
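
To make the key file idea a little more concrete, here's a rough Ruby sketch of what generating one could look like. The file format, field names, and key sizes are entirely made up; it's only meant to illustrate the concept:

require "securerandom"
require "yaml"

# A "network key file" holds a random network ID (used to find peers via DHT)
# and a shared encryption key (used to encrypt traffic between peers).
key_file = {
  "network_id"     => SecureRandom.hex(20),  # 160-bit ID, similar to a torrent info-hash
  "encryption_key" => SecureRandom.hex(32),  # 256-bit shared secret
}

File.open("my-network.key", "w") { |f| f.write(key_file.to_yaml) }
puts "Copy my-network.key to every machine that should join this network."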

Conclusion

To me, the coolest and most interesting point is piggybacking on existing BitTorrent DHT nodes. Whether BitTorrent client developers would consider that evil, I don't know.

That's why I'm writing this post. I'm hoping to get some feedback, suggestions, insults, and ideas from people who are smarter, and know a lot more about these kind of technologies than I do.

In terms of an actual implementation, I would personally love to build this thing. But my networking and low-level programming skills are not great at all, so it would take some time before I could have anything presentable. If someone else is interested in starting it, though, I would very much like to be part of the effort in any way I can until I've gained the required skills.


Fan Boys & Haters, Seriously?
by Jim Myhrberg

Fan Boys and Haters have both been annoying me equally as of late. The most common claim is that their favorite company is good, while all the others are evil. This is simply bullshit.

Companies are not good or evil. They are simply companies. They are here to make a profit at the end of the day, and to that end, they'll do things their customers, rivals, and the rest of the world like, and things they don't like.

Apple has gotten a lot of media attention the past two years thanks to their app approval process for the App Store. Most recently, they're getting a lot of attention for locking out all advertising platforms aside from their own iAds platform from the App Store.

Microsoft was working with IBM in the late '80s to develop IBM's OS/2 operating system as the OS of the future. After it had shifted its internal focus to Windows, it kept up appearances with IBM and kept pushing 3rd-party developers to port their existing DOS applications to OS/2. Windows became a huge hit, OS/2 was abandoned, and a whole industry of software development houses had spent months to years porting their applications to OS/2. The result was that nobody had anything ready for Windows except Microsoft, whose Office package displaced all 3rd-party products on Windows and became an industry standard, as there simply was no other package available.

In more recent years, Microsoft created its Windows Media-based PlaysForSure™ standard for portable music players, and got the whole industry (except Apple) to adopt it. Little more than a year later, Microsoft released its own portable music player — the Zune — which was not compatible with PlaysForSure™, but used a new Zune-specific DRM scheme instead. In the midst of this, all the PlaysForSure™ partners were left hanging with a standard abandoned by the company that created it and owned the rights to it.

Even Google, with their "Don't be evil" corporate motto, don't get away without stains. It was recently revealed that Google's Street View cars have accidentally been collecting payload data from all the open wifi networks they drive by, all over the world. On an unencrypted network, all your emails, the webpages you visit, chats, and more are transmitted over the air in plain text. This plain text data is what Google has been collecting and storing for the past three years, accidentally. This is one of the biggest and worst invasions of privacy I've seen in a long time, at least.

Companies do bad, and even evil, things, but they also do good things. Neither of these makes a company good or evil in my opinion though. They are just companies, so stop treating them as people, religions, or any kind of entity that has the ability to be good or evil.

I tend to get labeled as an Apple fan boy, and a Microsoft/Google hater, a lot. The reasons for it are understandable, all things considered, but the labels are still wrong. The reasons being that I'm a Mac and an iPhone user, and I love both devices/platforms. In all honesty, I do hate Windows, as it's only ever really been a major pain in my ass, especially back when I was a Windows user, and still to this day as I'm always the one everybody comes to when their PC breaks for any reason. That said, I actually do like Windows 7, but could it replace Mac OS X for me? No way. Could an Android phone completely replace my iPhone? No way. It's close, it's the 2nd best thing next to the iPhone, but still no.

I don't care what companies do, what their policies are, or even how stupid or intelligent their CEO is. What I care about is the product, the experience it offers, and how well it fits what I want, need and like.

I use a Mac cause it's the perfect middle ground between the commercial application availability of Windows and the open source UNIX underpinnings of Linux/UNIX systems. I use an iPhone cause of its ease of use, its wide selection of apps, its UNIX sub-system available after jailbreaking, and cause of how well it integrates with everything from my music library to my photos, contacts, password managers and more.

To summarize what I'm saying:

I don't give a shit about the companies and their policies. I only care about my own personal experience with the products.

To point out how serious I am about this: in the spring of 2004 I bought my first Mac, at a time when I seriously hated Microsoft and Windows after having had more problems with my PC thanks to Windows than I can count. Guess what mouse I bought for my brand new €2800 PowerBook G4. A Microsoft Optical Notebook Mouse. It was simply the nicest and best mouse for what I needed and wanted. In the autumn of the same year, I bought an Xbox as well, and I loved it.

I hated Microsoft for all the problems and headaches I'd gotten thanks to Windows, but at the very same time I'd bought a cheap and an expensive Microsoft product, and I loved them both.

The same goes for the more recent "war" between iPhone and Android users. I actually own an HTC Magic, and I use it quite a lot — in fact, it's the phone I have in my pocket right now — but not as much as my 2G iPhone. Despite my iPhone's wifi being broken, I still generally prefer the iPhone cause of the experience and integration it offers me compared to the Android phone. Do I care about Apple's policies regarding iAds? Do I care about Google collecting payload data from open wifi networks? No and no. If the product is superior to all others for my needs, I'll use it.

So please, next time you come and tell me that you refuse to use ______ company's products cause they're evil, or you think their policies suck, or you think I'm a blind stupid fool cause I use ______ company's products, just shut the fuck up. I'm not interested, and I don't care.

Thanks for reading my quick little rant.

{insert picture of a unicorn caring for a cute little kitten here}


Get YouTube Video Bookmarklet
by Jim Myhrberg

There are a few ways to download videos off of YouTube; my favorite is a bookmarklet which injects a link onto the page to download the MP4 version of the video. The best of these bookmarklets was by someone over at the Google System Blog, and it was well maintained. Unfortunately, the post about their bookmarklet has been deleted, and as such, it hasn't been updated for the new video page layout YouTube switched to a few weeks back.

I took it upon myself to update the last version the GSB guys posted to work with the new layout, and it turned out to be dead simple. So without further ado, here's the bookmarklet:

Get YouTube Video

To install the bookmarklet, simply drag the above link to your browser's bookmarks bar. To use it, open a YouTube video page and click the bookmarklet. You will get a link right above the "Like" button underneath the video saying "Download as MP4". Simply right click on the link and select "Save link as..." or however your browser of choice words it.

I'll try to keep the above bookmarklet up to date as often as I can, or at least whenever I notice that I can't download videos anymore myself. So check back here if your bookmarklet stops working.


LiteMySQL: ActiveRecord's Little Brother
by Jim Myhrberg

Ever needed a quick and lightweight MySQL PHP library for some small single/multi page project? Pulling out a full ORM is just overkill, but writing the PHP code needed to connect to the server, run a query, and process the results is a lot of hassle? I thought so. I've been there too.

So what if you could do something like this:

<?php
$sql = new litemysql('host', 'username', 'password', 'testdb', 'books');
$books = $sql->find_all(array("author" => "John Twelve Hawks"));
?>

Rather than something like this:

<?php
$db = mysql_connect("host", "username", "password");
if (!$db) die("Could not connect: " . mysql_error());
$db_selected = mysql_select_db("testdb", $db);
if (!$db_selected) die("Can't use testdb : " . mysql_error());
$result = mysql_query("SELECT * FROM `books` WHERE `author` = 'John Twelve Hawks';");
$books = array();
while ($row = mysql_fetch_assoc($result)) {
    $books[] = $row;
}
?>

About two years ago, I found myself in need of just such a small, lightweight library. There were full ORMs like ADOdb and others available, but they were serious overkill for the kind of simple stuff I needed. After some googling, I noticed there didn't seem to be any in-between libraries. Either you had to go for a whole jumbo jet, or start folding your own paper airplanes, and I didn't like either option. I just wanted to grab existing paper airplanes, and start throwing them in the direction I needed them to go.

So, in true geek fashion when you want something ready-made to make your life easy, I ended up building my own such library, spending an order of magnitude more time on the MySQL connection part of the project than I would have needed if I'd just done it the ugly way, as in the second code example above.

And that's how I started building LiteMySQL. Rather than building it from the ground up though, I started by lifting the essential parts from the full ORM/ActiveRecord implementation I'd written for Zynapse, and mainly just wrote glue-code to make it a feature-complete library.

After the project sat ignored for close to a year, I recently spent an afternoon fixing some long-standing bugs, migrating it to GitHub, and writing some decent documentation for it.

You can read more, and download the library here.


How Are You?
by Jim Myhrberg

*** Initializing analytics system.
*** Initializing health check sub-system.
*** Booting health check sensors.
*** Running physical health checks.
[Warning] Optical subsystem initialized with errors:
[Error] Focal systems out of optimal range.
*** Running psychological health checks.
[Error] Corrupt system file: human-to-human-communication.dylib
[Warning] H2H protocol daemon is reporting malformed I/O.
*** Running humor health checks.
[Error] Syntax error in humor.conf.
*** Running spiritual health checks.
[Error] Spirit is corrupt or missing, attempting to use system default.
[Error] Default system spirit is missing.
[Error] No usable spirit found, system entering zombie mode.
*** Finalizing health checks.
*** System status: CRITICAL
*** Status summary: Visually impaired anti-social Zombie with
                    corrupt sense of humor.

This was originally a response I wrote to some chick I don't know who messaged me out of the blue asking "how are you?". After sending it, I thought it was funny enough to post here... lol


Zynapse Is Out
by Jim Myhrberg

So I finally let go of my own sense of perfectionism in regards to my Zynapse web-development framework for PHP5, which I started developing in 2007. It was never perfect, never ready, so I never released it.

Now after close to two years of hardly working on it, and doing most things with Ruby on Rails, I've decided to push it up on GitHub as is.

So check out Zynapse on GitHub if you're interested.


Built-in Sudo for Ruby Command-Line Tools
by Jim Myhrberg

I was looking through my gists today on GitHub, and decided I'd do a couple of posts on some of the pieces of code I've put up there. The first of which is the sudome Ruby method.

Ever written a command-line tool in Ruby that requires root access for one reason or another? The simplest way to achieve this is to have the end user call the command via sudo. It's not the most elegant solution there is, but it works.

A more elegant solution might be what the Fink Project is doing with their fink command. It doesn't need to be run via sudo, as it invokes sudo itself when needed. Meaning that when you run fink, you'll be prompted to type your password, just as if you had used sudo. Some might argue that this is not good practice, and they are probably right. But it all depends on the details of what you're doing.

A while back I was working on something for which the best solution was to make sure the tool always ran as root. To get identical functionality to Fink, I wrote the very simple method shown below:

def sudome
  if ENV["USER"] != "root"
    exec("sudo #{ENV['_']} #{ARGV.join(' ')}")
  end
end

Simply call sudome as early as possible in your code. If needed, it will re-run your script with sudo, requiring the user to type their password, at which point your script has full root access to the system.
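
For example, you'd call it at the top of your script, something like this (the file name and the privileged operation are just placeholders):

#!/usr/bin/env ruby

# ... define the sudome method from above here ...

sudome  # re-runs this script via sudo if we're not already root

# Everything below runs with root privileges.
File.open("/etc/my-tool.conf", "w") { |f| f.puts("configured=true") }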

» Original Gist on GitHub


JavaScript Performance Wars
by Jim Myhrberg

Is the difference between Chrome's V8 engine and WebKit's SquirrelFish Extreme (SFX for short) significant enough that we need to care whether we use Chrome or Safari/WebKit?

After some quick Googling for recent posts and comparisons of V8 and SFX, it seems nobody has bothered doing any comparisons after late 2008. Maybe I'm just completely outta the game, and the performance of the two engines is now common knowledge. If that's the case, it's knowledge I sure don't have.

I set out to do a quick (and unscientific) performance test between the latest released beta/development build of Chrome, and the latest WebKit nightly. I decided to use SunSpider as it's still highly respected from what I know, and a test only takes 2-4 minutes instead of 15 minutes with Dromaeo. For good measure I also threw in the latest alpha build of Opera, and the latest shipping versions of Safari and Firefox.

The Results

SunSpider Results

I don't know about you, but a 4.6 ms difference between Chrome and WebKit is something I really don't care about. I don't even care about the 54.6 ms difference between Chrome and the latest shipping version of Safari. Especially not since I've been using WebKit nightlies for the past 8 months with no more or fewer issues and/or crashes than with the standard Safari release.

Something that did grab my attention however was how well Opera 10.50 Alpha did with its new Carakan JavaScript engine. With that said, Carakan was eating about 80-90% CPU time during the tests, while all the other browsers only used about 30%. Firefox's TraceMonkey engine was unsurprisingly the slowest.

I'm looking forward to seeing what Opera does over the next couple of years with their desktop browser, both in terms of performance and page rendering. If you're following me on Twitter, you may know that from a web designer's point of view, I've had more issues and headaches with Firefox 3 lately than with Opera 10.

Detailed Results

I ran all tests with a freshly relaunched browser, with no pages open aside from SunSpider. The test system was my 2007 MacBook Pro with a 2.4GHz Core 2 Duo processor and 4GB Ram running Mac OS X 10.6.2.

Below is a list of the specific browser versions I used, and the detailed SunSpider results.

Don't take my findings too seriously however, cause I didn't have time to do multiple repeated runs, reboot the system between tests, or really do much of anything to properly ensure the results are 100% accurate. If you need seriously accurate numbers, run the tests yourself.


Was I Really “That” Social?
by Jim Myhrberg

As some of you might have noticed the last couple of days, I haven't been online much on IM networks. I'm not sure what originally kept me from launching Adium the other day, but along the way I've come to a realization.

The Problem

I realized that IM has been taking up way too much of my time, and constantly distracting me from both work and personal matters. You just can't hide from the tsunami of Growl notifications, bouncing and flashing Dock icons, plinging sounds, and more that almost all IM clients spew out in one form or another whenever you receive a message.

A friend of mine gave up on IM networks almost a year ago, for pretty much the same reasons. — Yes, I'm linking to Sugarenia again. She always seems to have the same opinions as me, just days/months before me. Dammit!

The Solution

Unfortunately I couldn't bring myself to completely disconnect from IM networks, so I took a slightly different approach than Sugarenia. I've gone out of my way to make sure my IM client — Adium of course — doesn't notify me via Growl, Dock icon, sound, telepathy, or even alien abduction. This means I'm online and I can chat, but unless I manually switch to the space/virtual-desktop where my IM client sets up camp, I don't even know if I have any new messages.

After a couple of days, I have to say it's quite a nice change. I only get distracted when I choose to check if there are any IM messages to respond to.

So now you know why it might take me 8 hours to respond to a chat message. I simply didn't want to check for new messages, or I completely forgot, cause I was working away all excited on my next attempt at building an awesome online service which will hopefully pay my rent, and maybe even for food.


About That “Pad” Thing
by Jim Myhrberg

So I was gonna write a post with my opinions about the iPad, but a cup of tea and staring at a wall of wet paint is almost more tempting. If you don't get why the iPad is important, and why it will succeed, I'm not even gonna try convincing you otherwise; time will just prove you wrong and make you feel stupid.

By this point you're probably thinking “What the fuck is this post about then?”. Sugarenia just wrote up a quick rant regarding her opinions on the iPad, which pretty much sums up my own opinions to the letter. Only she's written and explained herself much better than I most likely would have.

The iPad is not made for you and me, fellow geek. It’s primarily targeted to people that are still afraid of interacting with PCs, those that don’t have a clue about drivers and web apps and Wi-Fi setup. And this is exactly the kind of people that won’t buy a Linux netbook, dear Open Source zealots – because as much as Ubuntu has made Linux user-friendly, there’s still much filling that shows between the seams.

Read the full article, please.

Update: I just noticed a post by another friend of mine which is also good:

I think the main reason why self-described geeks are throwing a fit over the iPad is that it's a shiny new toy that's not meant for them.

Read the full article.

Update 2: Seems using Jekyll can have some downsides. I accidentally named the file for this post 2010-05-06-[...].md without noticing that it was labeled as posted “06 May”. I've corrected it, but the permalink to this post has changed. *facepalm*


Automated Profile Picture Update Service?
by Jim Myhrberg

After I updated my profile picture today, a friend of mine responded with:

Build a service that changes your profile picture in all social networks!

He also didn't like the new profile picture, but that's his problem. My first response was a "fuck off", followed by "Gravatar?". Obviously he didn't mean Gravatar, I just mentioned it to annoy him.

A service which automagically just updates your Facebook, Twitter, Flickr, YouTube, Vimeo, Gravatar, ... profile pictures would be quite cool. Thinking about it a bit more, there are three problems with building such a service:

  • A lot of these sites you would need to crawl programmatically using some kind of web-crawler library. It would be a pain to write, and an even bigger pain whenever they change anything in the HTML layout of their pages.
  • Some sites have specific and/or strange restrictions for image dimensions, file size, and even file format.
  • Will people actually trust such a service with passwords for all of their online social networking accounts?

The latter problem, trust, is definitely the biggest one, and I'm not sure you could overcome it unless the service were officially sponsored and/or operated by Google or somebody. I do think it could be a fun project to undertake, but I think it's pretty doomed right from the start unfortunately. Although, if I updated my profile pictures more than once every 4-5 years, I might just build a prototype for myself at least.


New Site and Blog Powered by Dr. Jekyll
by Jim Myhrberg

I finally found some time to rebuild my site, and add a blog. I'm also working on a portfolio, which I will probably be putting up on heartb.it. I haven't really decided how I'm gonna make the split between my personal site and work portfolio yet though.

For archival reasons I've made the previous versions of my site available for anybody who might be curious.

On an unrelated note, it happens to be my 24th birthday today; I haven't decided yet if that's a good or a bad thing. But at least I found time to push up my new site today, so I guess that's a good start at least :)

Powered by Dr. Jekyll?

My personal site has always been a very simple site. In the past it's just been a single HTML page which I've coded by hand and uploaded via SFTP. It's a simple and decently straightforward process. This time however, I wanted to incorporate a blog as well. My first choice was WordPress, as I've used it on my previous blog. But it's overkill for what I need, and it kept getting hacked all the fucking time even when I was keeping WordPress decently up to date.

So I'm using Jekyll this time around. Jekyll is a small website framework written in Ruby which generates static HTML files. It was created by one of GitHub's founders, and is used on GitHub Pages. Part of what makes it nice is that it's intended to be a quick and elegant blogging engine, rather than just a static site generator. It lets you write blog posts in pure HTML, Markdown, or Textile. Meaning I'm writing this post in TextMate, which always puts a smile on my face.

I'll soon write a more in-depth article about Jekyll and how I'm using it.

Comments with Disqus

Since I'm using static HTML files, I'm left with only a few — but awesome — solutions for having a commenting feature on the blog. Both Disqus and Intense Debate have great JavaScript-based commenting systems which work for static HTML sites. My favorite of the two is Disqus.

Deployment with Rake+Rsync

I've also opted for a much easier way to deploy to the live server once I'm done with changes locally. Namely Rake, Ruby's make-like build tool.

I've written a couple of custom rake tasks which run Jekyll to build the static HTML files, and then rsync said files to the remote server. So instead of using an SFTP client, or something like Coda, to upload and update the remote site, I simply run rake deploy from a terminal.

I get butterflies in my stomach whenever I think about how neat it is.
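
To give a rough idea, a simplified sketch of such a Rakefile might look something like this (the host and paths are placeholders, not my actual setup):

# Rakefile

desc "Build the static HTML files with Jekyll"
task :build do
  sh "jekyll"  # builds the site into _site/
end

desc "Build the site, then rsync it to the live server"
task :deploy => :build do
  sh "rsync -avz --delete _site/ user@example.com:/var/www/example.com/"
end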

Source Code Management with Git

After being an avid user of Subversion for about 5 years, I switched permanently to Git last August, when I spent 4 hours reading a PDF I had with me on holiday. So I'm obviously using Git for this site, and the source code is available on GitHub in all its glory.

Design

I really focused on minimalism, to the point that I'm not using a single image, but rather only text on a white background. This is a first for me, as I generally like to have nice rounded corners, drop shadows, or something along those lines, while still keeping things simple and elegant looking.

Since the design is highly text-focused, good typography was a must right from the start. I wanted to stray away from the standard web-safe fonts, to create a truly unique and elegant looking site in terms of its typography. To do this, I needed to embed fonts, and I used the @font-face technique for it.

The two fonts I'm using are Colaborate for body text, and DejaVu Sans Mono for fixed-width text and code examples. I got both from Font Squirrel's excellent @font-face fontkit page, which has hundreds of free and ready-to-use kits.

The End

{insert yo mamma joke here}. Have a nice day.

