https://daniel.haxx.se/about.html

This is the story of my background. What I’ve done and how I ended up like this.

Daniel Stenberg

I was born and raised in Huddinge, a suburb south of Sweden’s capital Stockholm. I have two brothers and two sisters.

1985 – it begins

I discovered the joy of computers for the first time sometime in the early 80s, when Kjell, a friend of mine, and I typed in BASIC listings at his place that we had eagerly read in some of the first C64 magazines, and I have been hooked ever since. Kjell owned a C64 before me, so it was in his home I had my first experiences of the computer world. My younger brother Björn and I subsequently saved up money for our own first computer, which we finally bought together in 1985, when I was 14 years old. A Commodore 64. A glorious and marvelous Commodore 64.

I was immediately fascinated by the concept of being able to control the computer and tell it what to do and how. I headed straight into programming and quickly learned BASIC and how to do simple stuff. Soon I realized that the cool stuff we could see other people do, and all the games and so on, was not made with BASIC. What did they use? Assembler.

The three of us (me, Kjell and Björn) dove wholeheartedly into the wonderful world of 6510 assembly. We started hacking demos because we liked watching demos and wanted to make demos too. We figured out that all the cool demo-making people were part of demo groups and had nicknames and so on, and we felt we had to join that spirit too, so we quickly founded our own C64 group, “Confusing Solution” (we could make fun of ourselves already back then).

This was when I started spending spare time on programming. Up to several hours per day. This is something that I’ve never stopped doing since…

Demo Scene

Due to a happy coincidence, Triad and Fairlight, two of the giant groups on the “demo scene” of the time, organized a “copy-party” in our school (Kvarnbergsskolan) in Huddinge in the late winter of 1987. We got sucked deeper and harder into the C64 demo and hacker spirit and community. During that meet-up with hundreds of other C64 geeks we met many like-minded people and released our first demo ever (actually, our first software release in any category – I was 17 years old by then). We released two more demos as Confusing Solution in early 1988. We spent more and more of our spare time coding C64 assembly.

Later, in the spring of 1988, we were invited to a small gathering by our friend Fonzi, who was then the leading person in the C64 group Super Swap Sweden (SSS). When asked, we decided to join their team. At that time, Super Swap Sweden was already a large and well-known group in Sweden that made both “cracks” (i.e. removed the copy protection from games and distributed pirated copies of them) and demos. We were swept away by the attention and didn’t hesitate to join this large group of friends. We went on to release more demos under the SSS flag, got better and learned more about the C64’s undocumented corners, opcodes and circuits.

Horizon

The three of us (me, Kjell and Björn) and a few other coders left SSS after a while, and instead we created Horizon together with a bunch of other demo-hacker friends from the Swedish “scene” (several came from the group Thundercats). Now we were definitely one of the leading demo groups in Sweden. We wanted a more tight-knit group that would focus on demos only – no cracking at all. We won a whole range of demo competitions in Sweden and Denmark over a period of a few years. We also organized some of the biggest nerd meet-ups in northern Europe during that period, so called copy-parties: we would gather more than 500 teenagers from all over northern Europe in a school over a weekend and spend it hacking on code, chatting, drinking Coca-Cola and then competing in a demo competition toward the end. (Such events would later be called LAN parties, but back in the late 80s and early 90s we had no LANs…)

The C64 golden age faded away for us – it felt like we were done with that platform and its set of limitations. Several of us looked at making the jump to the new emerging platform, the Amiga, and continuing the same activities there – as was common at the time – but the Amiga’s almost “unlimited” conditions (compared to the C64), with lots of memory, a super-fast CPU with plenty of registers, a blitter (co-processor) and an audio chip, in many ways took away much of what we considered the charm of demo-hacking: the strict limits. We only released one demo on the Amiga as Horizon.

(There’s a whole separate story about a different set of people who also called themselves Horizon on the Amiga and who also did demos, but this is not the place to tell that story.)

I did my mandatory military service basically throughout all of 1990, without a clear direction of what to program next.

Amiga

Instead of continuing with demos, Kjell and I started our ambitious project FrexxEd around 1991 – a customizable and programmable text editor for the Amiga. That same year – when I was 20 years old and had moved into my first apartment, which I shared with my brother – I made my professional debut in the IT industry by starting a job at IBM. I worked with RS/6000 machines and IBM’s Unix flavor, AIX. This was my first introduction to Unix and C and wow, I was immediately hooked and fascinated by the Unix concepts. “Unix is the future!” I said to my girlfriend at the time (she would later become the Mrs. Stenberg I’m married to today), who of course had no idea what I was talking about. I learned all this new stuff primarily through man pages. My actual work was probably called something like system installation and setup: RS/6000 machines arrived to us to get customized and polished before they were sent out to customers.

IBM

At IBM, I learned that there was lots of free source code for programs available, and that there was a super cool editor called Emacs with which you could do anything. Much of the inspiration and many of the ideas for FrexxEd, which we continued to work on, came from my discoveries and lessons with Emacs at that job. Emacs existed on the Amiga too, but it never really did itself justice there, and we thought we could do better in the (somewhat limited, compared to the big Unix machines of the time) Amiga environment.

FrexxEd

Basically the only thing I did software-wise on the Amiga was to write FrexxEd. I wrote a dedicated scripting language for it, the Frexx Programming Language (FPL). I made FPL really portable and it ran fine on several Unixes as well as on AmigaOS, etc. Meanwhile, Björn (my brother, remember?) wrote a BBS system for OS/2 that used FPL quite extensively. We ran our dual-line BBS “The Holy Grail” for several years into the 90s.

The name “FrexxEd” was just a playful word using two x’s, something we enjoyed and a habit that has followed us later in life too. Basically the Swedish word fräck (translates to “cheeky”) Englishified with x’es, and then “Ed” tacked on to the end of it, like many text editors were named at that time. The fact that the name ended up similar to the Amiga scripting language ARexx was actually not intentional.

FrexxEd was shareware for a very long time. We came from the C64 and Amiga background where FOSS was not a familiar concept and did not really exist within that culture – sadly enough, as it would have been a really good idea for that community too. Eventually I learned the true ways of life and released FPL fully open. In modern times, people who run one of the newer AmigaOS versions have found a renewed interest in FrexxEd and have ported it over. It is great fun that it is still alive – containing code of ours that is more than 25 years old. The FrexxEd code still exists on GitHub.

Dancer

In 1993, I started working as a full-time C developer for real (at Frontec Railway Systems). I programmed embedded devices that measured the temperatures of railway wagons’ axle bearings as they passed over the device and its infrared camera – and I came across and programmed on SunOS and Dell Unix as well. I discovered IRC and the fact that there were lots of people out there to talk to. I hung out a lot in #amiga on EFnet, before IRCnet existed. This soon led to me writing an IRC bot in my spare time with a friend (Bjorn Reese) from #amiga – a bot that could be scripted with FPL! We released the bot (Dancer) and FPL fully open source. It wasn’t something we deliberated over much; there was never really any other option. If we could stand on the shoulders of giants and use this large amount of very good software, the least we could do was to also share our contribution with the world. That bot was written primarily for Unix systems (I believe SunOS on SPARC was the system we used) and was my first real application doing TCP/IP networking.

By now the Amiga had completely left my life, and I used my job’s modem pool with dial-back to log on to my employer’s various Unix machines to IRC and hack on bots in my spare time. I still spent a lot of time in #amiga and #amigaswe, where I made lots of online friends.

Httpget

After the summer of 1996, I changed roles at work and started as a consultant within embedded systems. Frontec Tekniksystem was then the name of my new professional home. At my first assignment I improved a PPP implementation for Ericsson running on pSOS. I then moved on and implemented my own malloc replacement. That was the beginning of my years as an embedded systems consultant, almost always working on-site at the customer’s, deep within their product teams.

One day, later in 1996, it struck me that of course it would be cool to have a service added to the bot where you could ask it for up-to-date exchange rates of currencies. Shopping and prices were often discussed in the channels, so why not offer something that could make the bot say what 100 SEK would equal in US dollars? OK, to make this happen I first needed a command-line tool to download currency rates from a web page at a regular interval.

I found a little tool online called ‘httpget’ which was written by a Brazilian fellow named Rafael Sagula. It fit almost perfectly. It only required a few small fixes and patches first…

Around this time I installed my first Linux systems at work, and we fired up our first public web servers and more. As I had experience from various other unixes from before, Linux wasn’t particularly challenging to install but was still way more interesting due to its price and level of freedom.

I had more or less taken over as leader of the httpget project when I found another currency exchange site that hosted the data and offered it using GOPHER, so I had to implement support for that protocol too. Then ‘httpget’ was not a good name anymore, so I changed it to ‘Urlget’. Not long after that, I added FTP support as well, and then the step to adding FTP upload support wasn’t very big.

In late 1997 we registered our first company, Haxx HB, to use as a sort of front when doing odd spare time jobs outside of our regular employments. Another playful name (hacks, the plural of hack, but with two x’s instead of “cks”). A couple of years later we converted it into a proper and real corporation – “aktiebolag” in Swedish.

Spare time hacking and full-time work

Already pretty early on in my adult life I established a system that would allow me to keep doing spare time software while still working full-time and spending time with my wife, and later my kids, during the day. I realized they need more sleep than I do, so I simply started staying up after they go to bed, which gives me around two extra hours, totally alone, to work on whatever I want.

Two hours per day, every day, over decades adds up to a lot of time. Of course I also put in a little extra at times, and during vacations I don’t spend as much.

curl

By the time the urlget tool got the ability to do uploads, the name had become misleading again, so the project was up for a name change one more time and curl was born. curl as in “see URL” or “client for URLs”. Gee, naming things is really hard!

I made the first curl release on March 20, 1998. curl version 4.0, as I kept the version numbering from the previous names.

My interest in the Dancer project slowly faded away and I put most of my spare time programming focus on curl.

Of course, over time I have also dipped into and participated in other projects. I spent a lot of time on hypermail – a program that converts mailboxes to HTML pages. I have written ‘mail2sms’ to convert email to SMS (it was useful in the days before smartphones), and worked on ‘Smash’ to send SMS messages to operators’ modem receivers. I worked on Trio – a printf and string function library. I have contributed code to wget and am somewhat involved in it. I was an early contributor and committer in the Subversion project. I write and maintain ‘roffit’ – a tool to create HTML pages from nroff files (man pages).

Licensing

curl had started out GPL licensed pretty much without thought, but after some thinking I decided the GPL approach wasn’t exactly in line with my philosophy.

In 1998 when we released curl 4.9, we switched to the MPL license. It is a very liberal license and was much more in line with what I really wanted people to take away from curl: have them send back code if they actually change the curl code, but otherwise they could do whatever they wanted.

However, MPL proved to be a really unwise choice when we later launched libcurl – curl as a library made for other programs to use. Because the MPL is considered GPL-incompatible, applications that were GPL licensed could not easily use libcurl because of this license “collision”. Therefore, in 2001 curl was relicensed again, this time to an MIT license. That license has stuck, and I have not regretted the choice since.

Of course I realize that people can take our code, change it and ship it with their applications and become millionaires without us ever getting back any changes. But in reality this is not a problem because people do not want to maintain their own forks, their own custom versions of curl. By avoiding a copyleft license we have successfully seen numerous businesses use curl. Companies that otherwise would not have considered using curl.

Rockbox

In the year 2000, lots of things happened. Several of my friends and colleagues and I switched employers to Contactor AB, but I basically remained doing the same thing: embedded systems development as a consultant. I got married.

In that period I co-founded the Rockbox project (together with Björn and Linus) and I worked a lot within that project for many years. It was great fun and I met a lot of new friends through it, many of whom I still meet and chat with regularly. Rockbox is an mp3 player firmware replacement. We reverse engineered mp3 players and replaced the original firmware with our free version, which often was far better than the original in terms of functionality, features and battery life.

Up to that point, curl was just a command line tool. You’d invoke it from scripts or from a shell prompt. I of course suspected that there were programs and systems out there that could benefit from getting curl’s powers into their applications, and that doing curl as a library would enable that. curl was always sort of written with that mindset internally, but of course it needed some work to make a real and official API out of it.

On August 7, 2000, we released the first libcurl version: libcurl 7.1. It was immediately used and appreciated by early adopters, and it gave me inspiration and energy to continue down that path.
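
Today, using libcurl from an application boils down to creating an “easy handle”, setting options on it and asking it to perform the transfer. Here is a minimal sketch in C using current function and option names (not necessarily those of the 7.1-era API); the URL is just a placeholder:

```c
#include <stdio.h>
#include <curl/curl.h>

int main(void)
{
  CURL *curl;
  CURLcode res;

  curl_global_init(CURL_GLOBAL_DEFAULT);

  curl = curl_easy_init();            /* one "easy handle" per transfer */
  if(curl) {
    curl_easy_setopt(curl, CURLOPT_URL, "https://example.com/");
    curl_easy_setopt(curl, CURLOPT_FOLLOWLOCATION, 1L); /* follow redirects */

    res = curl_easy_perform(curl);    /* blocking: runs the whole transfer */
    if(res != CURLE_OK)
      fprintf(stderr, "transfer failed: %s\n", curl_easy_strerror(res));

    curl_easy_cleanup(curl);
  }
  curl_global_cleanup();
  return 0;
}
```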

Life 2.0

I continued to hack on curl in my spare time, and worked as an embedded systems consultant during my days. In 2003, my wife and I bought a house in a southern suburb of Stockholm, and on September 26 our daughter Agnes was born. Life would never be the same again (as every parent knows).

c-ares

Name resolving for applications had always been done with a synchronous function call in the POSIX API, and this had been a concern for a while for me and a few friends, who at this point had been pondering starting a project to work on the problem. One day, however, I stumbled over an existing library called ares that did almost exactly what we wanted. I quickly took it to heart and implemented support in curl for using this library to do asynchronous and non-blocking name resolves. Very soon I learned that the maintainer of ares pretty much considered his work done on that code base and didn’t want to merge the changes I fed back and deemed necessary – for example support for building and working on Windows. I felt that I had no other option than to fork the project and adopt it myself to drive it forward. So I did, and c-ares was born.
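
To give a feel for what the asynchronous style looks like, here is a minimal hypothetical sketch using today’s c-ares API (not curl’s internal code, and the looked-up name is just an example): the lookup call returns immediately, and the caller drives the resolver from its own loop until the callback fires.

```c
#include <stdio.h>
#include <sys/select.h>
#include <arpa/inet.h>
#include <netdb.h>
#include <ares.h>

/* called by c-ares once the lookup has completed (or failed) */
static void resolved(void *arg, int status, int timeouts, struct hostent *host)
{
  (void)arg; (void)timeouts;
  if(status == ARES_SUCCESS && host && host->h_addr_list[0]) {
    char ip[INET6_ADDRSTRLEN];
    inet_ntop(host->h_addrtype, host->h_addr_list[0], ip, sizeof(ip));
    printf("%s -> %s\n", host->h_name, ip);
  }
  else
    fprintf(stderr, "lookup failed: %s\n", ares_strerror(status));
}

int main(void)
{
  ares_channel channel;
  ares_library_init(ARES_LIB_INIT_ALL);
  ares_init(&channel);

  /* fire off the query; this call does not block */
  ares_gethostbyname(channel, "curl.se", AF_INET, resolved, NULL);

  /* drive the resolver: the caller owns the event loop */
  for(;;) {
    fd_set readers, writers;
    struct timeval tv, *tvp;
    int nfds;

    FD_ZERO(&readers);
    FD_ZERO(&writers);
    nfds = ares_fds(channel, &readers, &writers);
    if(nfds == 0)
      break;                       /* no outstanding queries left */
    tvp = ares_timeout(channel, NULL, &tv);
    select(nfds, &readers, &writers, NULL, tvp);
    ares_process(channel, &readers, &writers);
  }

  ares_destroy(channel);
  ares_library_cleanup();
  return 0;
}
```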

IIS funding

When my daughter was roughly a year old, I applied for funding from the Swedish foundation IIS (The Internet Foundation in Sweden) to get some focused development time on curl. I wanted to implement a new API and make curl more fit for doing really large numbers of parallel transfers. I was given a grant that I worked on during the spring of 2005, and the multi_socket API was born. Doing 10,000 simultaneous transfers in the same thread became possible. Working from home for a few months doing this was awesome.
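
The idea behind libcurl’s multi interface is that a single thread owns many easy handles and drives them all concurrently; the multi_socket API goes one step further and lets the application plug libcurl into its own event loop (epoll, kqueue and friends) via CURLMOPT_SOCKETFUNCTION and CURLMOPT_TIMERFUNCTION callbacks, which is what makes tens of thousands of transfers in one thread practical. Below is a sketch of the simpler multi loop, with made-up placeholder URLs, just to show the shape of it:

```c
#include <stdio.h>
#include <curl/curl.h>

int main(void)
{
  /* placeholder URLs - a real application would have many more */
  const char *urls[] = { "https://example.com/a", "https://example.com/b" };
  CURL *handles[2];
  CURLM *multi;
  int still_running = 0;
  int i;

  curl_global_init(CURL_GLOBAL_DEFAULT);
  multi = curl_multi_init();

  /* one easy handle per transfer, all attached to one multi handle */
  for(i = 0; i < 2; i++) {
    handles[i] = curl_easy_init();
    curl_easy_setopt(handles[i], CURLOPT_URL, urls[i]);
    curl_multi_add_handle(multi, handles[i]);
  }

  /* a single thread drives all the transfers concurrently */
  do {
    curl_multi_perform(multi, &still_running);
    if(still_running)
      curl_multi_wait(multi, NULL, 0, 1000, NULL); /* wait for socket activity */
  } while(still_running);

  for(i = 0; i < 2; i++) {
    curl_multi_remove_handle(multi, handles[i]);
    curl_easy_cleanup(handles[i]);
  }
  curl_multi_cleanup(multi);
  curl_global_cleanup();
  return 0;
}
```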

Adobe funding

In 2006 my second child, Rex, was born, and he was still just a few months old when I was contracted by Adobe to work on implementing SFTP support for curl. Adobe wanted to use it in one of their products to complement FTP uploading. SFTP being based on the SSH protocol, we needed a proper library to handle the binary protocol-level parts so that I wouldn’t have to do the actual SSH bits within the curl project.

I loved getting the opportunity to once again work full time on curl for a few months.

libssh2

I looked around for options and at the time I found two feasible alternatives. Quite amusingly, they were named libssh and libssh2 (yes, the number two at the end is the only difference in naming). Unfortunately, neither of them offered a truly non-blocking API, and since my goal was to integrate and use this within libcurl, which already had a non-blocking API, that was an absolute requirement. So I asked both projects about it: basically how they looked at the prospect of (me) adding non-blocking support and what they thought about it. Both responded fairly quickly from what I recall. One in a fairly dismissive manner, suggesting I should use threads instead, and the other in a welcoming and interested fashion. Of course I went with the project that gave the warmer welcome. I immediately felt welcome and got to know Sara, who ran the libssh2 project.

In cooperation with others in the libssh2 project we implemented a non-blocking API, I made curl use it, and starting in November 2006 we could do SFTP and SCP transfers that way.
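
With a libcurl built against libssh2, an SFTP transfer looks just like any other transfer to the application; essentially only the URL scheme and the credentials change. A small sketch, with a hypothetical host, user and path:

```c
#include <stdio.h>
#include <curl/curl.h>

int main(void)
{
  CURL *curl;
  curl_global_init(CURL_GLOBAL_DEFAULT);

  curl = curl_easy_init();
  if(curl) {
    CURLcode res;
    /* hypothetical server, user and path - replace with real ones */
    curl_easy_setopt(curl, CURLOPT_URL,
                     "sftp://sftp.example.com/home/user/file.txt");
    curl_easy_setopt(curl, CURLOPT_USERPWD, "user:secret");
    /* key-based authentication would instead use
       CURLOPT_SSH_PRIVATE_KEYFILE and friends */

    res = curl_easy_perform(curl);   /* downloads the file to stdout */
    if(res != CURLE_OK)
      fprintf(stderr, "SFTP transfer failed: %s\n", curl_easy_strerror(res));
    curl_easy_cleanup(curl);
  }
  curl_global_cleanup();
  return 0;
}
```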

Sara, the lead of libssh2, changed jobs in 2006 and was as a consequence unable to continue maintaining the project, and pretty soon I took over as maintainer of libssh2.

HTTPbis

I had been working with all these protocols up until then without knowing, and not really caring, exactly how protocols are made or how decisions about them get taken. But the more I worked with HTTP and all its intricate details, the more aware I became of differences between implementations and the struggle to work with servers that obviously didn’t follow what was written in the RFCs. Then one day someone pointed out the HTTPbis working group to me.

HTTPbis was an IETF working group that had been started in 2007 with an effort to refresh the HTTP/1.1 spec. I joined the list and started to follow the development and discussions. I wrote my first post to the list in the spring of 2008.

IETF 75

After having gradually increased my participation in the HTTPbis group over the years, it was a lucky fluke that the 75th IETF meeting, in the summer of 2009, happened to be organized in Stockholm, Sweden – my home town. Since curl and HTTP were primarily hobbies of mine, I had a hard time motivating the investment and travel budget of going to IETF meetings abroad. But this time the circus was coming to me, and I finally got to meet a lot of the mailing list participants in person for the first time. Friends! This made me even more interested in and motivated to work within HTTPbis going forward.

Developers sometimes ask me if the slowness and bureaucracy of standardization isn’t tedious. For me, working within the IETF is a matter of bringing technology and interoperability forward: being involved and ensuring that the specs get done right, taking the right things into consideration and not going overboard fiddling with things we shouldn’t. It is good for everyone to have a good IETF. I find the spirit and working methods to be very similar to open source.

For example, we carried out work within the IETF to specify how cookies are actually used in HTTP. Cookies had been around for maybe 15 years already at the time, and the only spec that had actually been used was less than one hundred lines long and totally useless. Attempts had been made over the years to correct it, and at least two new cookie RFCs were written that failed to get adopted. Finally we started a group within the IETF that worked to document how cookies actually work on the web. I felt that I, as an independent, non-browser-oriented cookie parser implementor of many years, could provide good feedback and a completely different point of view than most of the others who were participating – many of them coming from the browser world. I’d like to think my few bits of contribution helped make RFC 6265 as good as it is!

Haxx AB

Professionally, I had spent the last several years doing contract work where I had basically sold myself in, or had gotten the jobs myself, while at the same time feeling that my employer wasn’t really going in the same direction I was. I felt that I didn’t really get my money’s worth there. At the end of August 2009 I quit my employment and instead became the first full-time employee of Haxx AB, our own firm.

In 2009, I’m honored to say that I was awarded the Nordic Free Software Award along with Simon Josefsson for my work in open source and free software up until that point.

Under our own name (Haxx) I continued to do embedded systems contracting, now being my own boss and of course having the ultimate freedom to decide what jobs to take and how to spend my time and money. I still didn’t get very many curl-related jobs beyond the occasional smaller hacks and minor improvements (and a series of smaller “I want to automate this using curl, can you do it for me please” tasks), so the protocol side remained a spare time occupation.

A few months after me, my brother Björn joined as Haxx employee number two, and a year later Linus became employee number three. What a glorious development. Looking back, that switch was one of the best decisions I’ve ever made in my professional life.

Haxx was like a lifelong dream transformed into reality: a small number of close friends who are all experts in embedded systems and Linux. We worked as expert consultants and contractors for companies that built various embedded systems. Embedded systems today means, to a very high degree, Linux and open source.

HTTP/2

The HTTPbis working group took it upon itself to work on an update to HTTP/1.1, which had been the major HTTP version for many years. It had started to show its age, and HTTP/2 took off from Google’s SPDY. I participated in that work.

Joining Mozilla

In the fall of 2013 I ended a two-year contracting job for Enea AB, where I had worked fiercely to kick-start their embedded Linux distribution. I looked – and asked – around my wider circle of contacts and friends to see if anyone had an interesting opening for me to take on next.

Someone did. Patrick McManus worked in the networking team at Mozilla and asked if I wouldn’t be interested in doing my next gig for them. I was thrilled to get the opportunity, even if it meant doing it as an employee and not as a contractor. It felt weird to give up that style of life, but for this chance I was willing to do a lot. I traveled to Mountain View in November 2013 and did seven different interviews in one rather long day…

I started at Mozilla in the first days of January 2014, in the networking team. HTTP, FTP, DNS, cookies, caching, sockets etc. All day at work. And then all night with curl. Mozilla even allowed me to spend a part of my work time on curl stuff! Mozilla has no office in Sweden, which meant I could work full-time from home.

RFC 7540

In May 2015 the HTTP/2 RFC shipped. In connection with the introduction of this new protocol version, I did several talks and presentations to a lot of different audiences in multiple countries. I also wrote a document about it, called “http2 explained”, that I released freely and openly on the web. It turned out to be a huge success, and during the period I counted downloads, I saw more than 200,000 of them!

Second best developer in Sweden

In 2016, the Swedish online publication Techworld had a contest for “Sweden’s best developer” and accepted nominations from the public. They had held this contest before, but this time I was nominated and ultimately awarded second place!

QUIC

The QUIC working group was formed in the IETF during late 2016, and I joined the mailing list and subscribed to the GitHub repository at once to keep track of, and possibly participate in, the development.

US issues

In December 2016 I attended the week-long Mozilla all-hands meeting in Hawaii (which was also my 12th visit to the US over the years – yes, I’ve had reason to go back and carefully count the occasions). In June 2017 I was set to travel to San Francisco for another all-hands company meeting, when I was refused boarding of the flight due to unspecified “problems with my ESTA”. (ESTA is the visa waiver program under which I, as a Swede, can travel to the US.)

As my employer at the time, Mozilla engaged people on both the American and the Swedish side to try to figure out what was wrong and what we could do to correct the situation. Unfortunately, no one would offer any clues or information about why I was denied, so that effort resulted in nothing.

In the spring of 2018, I reapplied for ESTA, got denied, and then applied for a visa instead. I did the final steps for that when I visited the US embassy in Stockholm, Sweden for an interview on April 17, 2018.

Polhem Prize

I was awarded the Polhem Prize in October 2017 for my almost 20 years of having run the curl project and its impact on the world. An amazing honor.

At the award ceremony, I was handed a gold medal by the Swedish king himself.

HTTP/3 explained

In November 2018 it was announced that the protocol previously called just “HTTP over QUIC” would officially become HTTP/3. I renamed the new document I had been working on, and soon I could reveal “HTTP/3 Explained” online. Free and open. Soon enough, volunteers joined in and started contributing translated versions.

Leaving Mozilla

In December 2018, I left my employment at Mozilla. I had spent almost five full years employed there and it was a great time with many awesome colleagues and friends. Working full-time from home on open source had been awesome, but it was time for me to do something else.

Why did I leave? Two parts: 1. I was bored with the messy C++ Firefox development and with getting more bug reports filed than we managed to close, and 2. my manager turned out to be a bully who worked hard to make my life miserable.

A new home: wolfSSL

In February 2019 I joined wolfSSL. The plan is to do commercial curl support and work on curl as close to full-time as possible. wolfSSL is an American company and they are fully aware that I can’t travel there. I’m the only Swede at wolfSSL and I can continue to work from home. I love it.

Visa!

My silly US travel situation lasted until November 9, 2020, when, after 937 days of waiting, I finally received a visa in my passport. Of course, at that time the Covid-19 pandemic was still raging, so it was not the ideal moment to travel anyway.

Future

I never plan very far ahead. I’ve come to terms with the fact that I probably will never travel to the US again.

Frontpage

I am the author and maintainer of cURL and libcurl. An internet protocol geek, an open source person and a developer. I’ve been programming for fun and profit since 1985. You’ll find lots of info about my various projects on these web pages and on my GitHub profile.

I participate within the IETF, primarily in the HTTPbis and QUIC working groups.

I speak in public every now and then.

I stream curl development on twitch occasionally.

I live and work in Huddinge, just south of Stockholm, Sweden.

I treasure my wife and two kids.

I work for wolfSSL. I do commercial curl support. If you need help fixing curl problems, fixing your app’s use of libcurl, adding features to curl, fixing curl bugs, optimizing your curl use, or getting libcurl education for your developers… then I’m your man. Contact us!
