Note: This is basically a
notes page for myself to dump in computer-related stuff that I think I
might want to reference in the future. If it's of interest to
you, great, otherwise, you might want to just ignore this, as it has no
particular focus other than that it is computer related....
Lyle
"In this day and age, computer systems are getting faster and more
capable, but they still do not eliminate the need for a sensible,
intelligent person to run the show. Computers will never be 'smart
enough for any fool to use.' ... When you go looking for a software
package, don't just look for which one has the most automation. Don't
believe that because it has all that automation, it will make your job
or your life easier. It won't. ... There is no substitute for using
your own brain to get a job done right." - Howard Chu of Highland
http://www.directron.com/firewirevsusb.html
Firewire vs. USB: A Comparison
By Nathanael Copyright © Directron.com 2005
The following article is based on years of experience. It is provided
as a free service to our customers and visitors. However, Directron.com
is not responsible for any damage as a result of following any of this
advice.
Copying the contents for commercial purposes is strictly prohibited
without Directron.com's written consent. However, you are welcome to
distribute these computer support tips free to your friends and
associates as long as it's not for commercial purposes and you
acknowledge the source. You are permitted and encouraged to create
links to this page from your own web site.
If you have used a computer within the past five years, chances are
you've used USB (Universal Serial Bus) devices many times. From
mice and keyboards to printers and external hard drives, USB devices
are nearly ubiquitous; in fact, over 1 billion USB devices have been
sold.
On the other hand, Firewire-based devices are somewhat less prevalent.
Nearly all digital camcorders sold after 1995 have included a Firewire
connection, as have all modern Macintosh computers. Additionally, many
new external storage devices include a Firewire connector. However,
lower-end devices such as mice and printers are rarely (if ever) seen
with Firewire connectivity.
In the next two sections, we will look at the history of these two
similar I/O ports. Then, we'll compare them side-by-side to see which
technology is better for specific applications.
USB: A Brief History
Version 1.0 of the USB specification was released in January of 1996 by
the USB Implementers Forum (USB-IF) and was followed up by version 1.1
in September of 1998. A theoretical maximum of 127 devices per
controller is specified. Both versions 1.0 and 1.1 support a maximum
transfer speed of 12Mbps ("Full Speed") and can fall back to 1.5Mbps
("Low Speed") if need be.
Note that these data rates are in megabits per second (Mbps), as
opposed to megabytes per second (MBps), a commonly confused pair of
notations.
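Since the two notations differ by a factor of eight (8 bits to the byte), the conversion is a one-liner; 480 here is USB 2.0's "Hi-Speed" figure:

```shell
# Convert megabits/s to megabytes/s: divide by 8 (8 bits per byte).
mbps=480
echo "$((mbps / 8)) MBps"    # USB 2.0's 480 Mbps is at most 60 MBps
```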
USB version 2.0 was released in 2000, upping the theoretical maximum
transfer rate by a factor of 40 to 480Mbps, dubbed "Hi-Speed". USB 2.0
devices are backwards-compatible with USB 1.x devices and controllers,
and can fall back to "Full" or "Low" speed in order to coexist with
older devices. Nearly all new products on the market are USB
2.0-compatible.
Both USB 1.x and USB 2.0 allow the use of two separate types of
connectors, Type A and Type B, depending on the requirements of the
device itself. Type A connectors are almost always used on the host
side (computer or hub), while Type B connectors are smaller and are
frequently found on the device side in printers, scanners, and other
similar hardware.
Both types of connectors can provide up to 500mA (milliamps) of power
to connected devices, though devices that require more than 100mA
should be self-powered as each USB port generally has a maximum of
500mA of power to share between all devices. A device that draws all of
its required power from the USB bus is referred to as a "bus-powered"
device.
Windows 95 OSR2 (OEM Service Release 2) included limited support for
USB; the original release of Windows 95 had none. Windows 98--and more
importantly, Windows 98 SE--added much better support for USB, but
Windows XP's USB support is the best and most robust, by far. Apple's
Mac OS has supported USB devices since prior to version 9.0.4, but that
release of the operating system added substantially better support.
Firewire: A Brief History
The origins of Firewire date back to the mid-1980s. Engineers at Apple
Computer devised a high-speed data transfer technology for Macintosh
internal hard drives they called 'Firewire'. Realizing the potential
for a technology that allowed high-speed transfer to and from
hot-swappable devices, Apple presented this technology to the Institute
of Electrical and Electronics Engineers (IEEE).
In December of 1995, the IEEE released an official Firewire
specification, dubbed IEEE 1394. This specification, sometimes referred
to as 'Firewire 400', describes a hot-swappable peripheral interface
with transfer speeds of 100 Mbps, 200 Mbps, and 400 Mbps. During the
late 1990s, this standard found its way into Sony electronics (mainly
digital camcorders) under the title 'i.LINK'. In January of 1999, Apple
released what was probably the first personal computer system to
include Firewire ports by default: the Blue PowerMac G3. All Macintosh
models from then on have included Firewire connectivity.
Firewire cables come in two variations: 4-pin and 6-pin. 6-pin cables
provide up to 30V of power, allowing for fully bus-powered devices.
4-pin cables do not provide power.
In April of 2002, the IEEE released an updated Firewire standard,
dubbed IEEE 1394b. IEEE 1394b allows for theoretical maximum transfer
rates of up to 3.2Gbps. Apple commercially released a subset of this
new standard under the title 'Firewire 800' in 2003.
Firewire 800 devices support a maximum transfer speed of around
800Mbps. Firewire 800 adds a new cable type: 9-pin cables (also called
'beta' cables), which support the full speed of Firewire 800.
Firewire 800 is backwards-compatible with Firewire 400 when 'bilingual'
(9-pin to 6- or 4-pin) cables are used. Firewire 400 devices will still
run at Firewire 400 speeds, even when connected to a Firewire 800 host.
The Comparison
General Peripherals: USB Wins
USB has almost completely replaced older I/O connectors such as
parallel, serial, and MIDI (joystick) ports. Instead of a confusing
collection of incompatible devices and connectors, you have a
one-size-fits-all connection that works on nearly all PCs manufactured
in the last 5-8 years.
While USB has not completely replaced PS/2 ports, USB mice and
keyboards are readily available. Nearly all recent scanners and
printers have USB connections, as do most other low-bandwidth
peripherals.
On the other hand, Firewire is almost completely absent from this
category. As mentioned before, Firewire is impractical for
low-bandwidth devices; this, coupled with the fact that most computers
(besides Macintoshes) do not include Firewire ports by default, has
kept Firewire out of this market.
Digital Imaging/Digital Video: Tie
Firewire is much more prevalent in this category. Almost all modern
digital camcorders come with Firewire connectivity. Because of
technical differences, Firewire is a better bet for transferring
uncompressed (raw) video from digital camcorders, even though USB 2.0
has a higher maximum speed (400Mbps vs. 480Mbps). Not many camcorders
have Firewire 800 connectivity yet, but this is expected to change over
the next few years.
Most digital cameras still use USB for image transfer. This is likely
due to the higher level of compatibility with current computers; nearly
all have USB ports, while considerably fewer have Firewire ports.
External Storage: Firewire Wins
This category includes external hard drives, external optical
drives/burners, and generic external drive enclosures. Though USB 2.0
and first-generation Firewire are nearly neck-and-neck, Firewire can
provide much more power over the bus: 30V as opposed to 5V for USB,
which means that external Firewire drives frequently do not need a
separate power brick.
Many manufacturers of external storage devices now produce models that
include both Firewire and USB 2.0 ports for maximum versatility. These
types of devices are probably the best bet for both speed and
compatibility.
Conclusion
USB and Firewire both have unique strengths and weaknesses. USB's
ubiquity makes it ideal for devices that require high compatibility
with current hardware. Firewire's generous bus power and internal
architecture lend themselves well to external storage and digital
video applications.
If you have any questions relating to this or any other topic, feel
free to post them on the Directron.org Help Desk.
Last updated: 03/24/05
If you find this article useful, please create a link to it:
http://www.directron.com/firewirevsusb.html
- from your website or tell a friend about it. If you have any
comments or suggestions about this article, please email
information@directron.com
A Quick Start to Debian
Debian is one of the main Linux distributions, and although not
designed for the beginner, it is the basis for many newcomer-friendly
versions. Aimed at the more experienced user, it has a reputation for
being rock-solid stable and, once installed, extremely easy to
maintain.
This page is not designed for the complete newcomer, but it is my hope
that I'll save the more experienced Linux user a bit of time. The
information is all out there, but sometimes you have to dig a little
bit to find it. (It's also to save me the trouble of looking up things
I forget if I have to install it on another machine--as one friend
said, since it's true that once installed, you don't have to install
it again, one does forget.)
Firstly, you have to install Debian. The easiest way is from CD.
Although they don't make it easy to find a complete iso, suggesting you
use jigdo, a bit of searching will usually find it. (Hint: try linuxiso.org.) However, using jigdo
isn't very difficult. One installs it (for example, in FreeBSD it's in
ports), gets a url from the Debian web pages and types something like
jigdo-lite http://us.cdimage.debian.org/jigdo-area/3.0_r2/jigdo/i386/woody-i386-1.jigdo
(If your browser broke that, it should be on one line)
After that, just press return when it asks for files to scan, and it
will ask for a Debian ftp site. You can type in (assuming you live in
the US)
http://ftp.us.debian.org/debian/
and it will do the rest, creating an iso image to burn. (I use stable,
woody, as an example; one could use testing or unstable as well.) Note
that jigdo does require wget; FreeBSD installs it as a dependency.
I just use Woody (stable) as it seems to have more on the first CD. I
downloaded Sid (unstable) once, and it didn't seem to really be an
install CD. (I just downloaded CD-1.) A quick Google search suggested
that wasn't completely wrong, so I shrugged and just got Woody. I
didn't research it in depth at all, so take that statement with a
grain of salt.
Since, as you'll see below, upgrading on a reasonably fast machine
with a broadband connection probably takes less than an hour, I didn't
research this any further.
Once you have the CD burned, installation, despite Debian's reputation
for being difficult to install, is fairly straightforward. Again, this
page is not for the complete beginner, so I'm not going to walk you
through the installation; there is information on the Debian web page.
If you are used to text- or ncurses-based installations, it is fairly
typical. Once the CD boots, hit F3 for the various options--I choose
bf24, which uses a 2.4 kernel rather than the default 2.2 one. You'll
have to pick your swap and / partitions, and if you're only used to
GUI installs, this might be intimidating. However, it's no more
difficult than, say, Slackware, and actually far quicker.
Upon reboot, you are given various configuration options. One is
deciding what you will use for your sources. I usually choose the
first two or three ftp sites listed in the US. When all this is done,
you can log in. (I skip tasksel and the like during this part; it
makes the upgrade go more quickly.)
As Debian unstable is far more stable than most distros' release
versions, the first thing I do is upgrade to unstable (assuming I
installed the stable version).
Note that you can actually install unstable from the beginning; I
learned this afterwards. When first installing, add verbose to your
boot arguments--for instance, I install the 2.4 kernel, so I add it to
the bf24 entry at the boot prompt when booting from the CD. Then, upon
first reboot, when choosing sources and the like, unstable is one of
the choices.
For those used to source-based distros like Gentoo, the upgrade of an
entire distribution is amazingly quick; Debian uses binary packages.
The first thing to do is change the ftp sites to get unstable sources.
In /etc/apt there is a file called sources.list. It will have lines
like
deb ftp://mirrors.kernel.org/debian/ stable main non-free contrib
Change stable to read unstable (but leave security at stable), then:
apt-get update
apt-get upgrade
apt-get dist-upgrade
This will take a while, and will require a bit of input as it goes.
However, when it's done, you have successfully upgraded from stable to
unstable.
Debian's apt-get is well known as one of the best package management
tools around. People have different ways of doing things; for me, I
usually install ssh next. This installs ssh and gives you the option
of running the sshd daemon as well. It will also create keys. Next, I
install a few other packages, such as wget, sudo, and the like.
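As a sketch, using the same apt-get syntax shown later on this page (the exact package list is up to you; ssh, wget, and sudo are just the ones mentioned here):

```shell
# Install the ssh package (which offers to run the sshd daemon and
# generates host keys) plus a couple of everyday tools in one call.
apt-get install ssh wget sudo
```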
Something that I've only noticed in Debian and its derivatives is that
sometimes, one way or another, a user is added to the sudo group.
(Being used to FreeBSD, I usually create a wheel group and use that as
the group that can do things.) If a user is in the sudo group, they
aren't asked for a password when they run a command with sudo. So, if
that's happened and you wish to change it, taking the user back out of
the sudo group (which, having passwordless sudo, that user can do
themselves) will work.
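A quick sketch of checking and undoing that; the username 'jo' is just a placeholder. Debian's deluser, when given both a user and a group, removes the user from that group rather than deleting the account:

```shell
# Show which groups the account currently belongs to.
groups jo

# Take the account back out of the sudo group (run as root, or via
# sudo, which this user can still do without a password).
deluser jo sudo
```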
Next, I want to install X.
Debian's tasksel will do this for you. However, it installs, in my
opinion, far too many packages that I don't want. So, I just install
x-window-system and xlibs-dev. By now you probably have the syntax
figured out:
apt-get install x-window-system xlibs-dev
As these are binary packages, it is far quicker than a source build.
After installation, it takes you through a dialog similar to
xf86config.
If you leave things as is, the next reboot will boot you up into X. I
prefer to boot into text mode, so I remove xdm from the startup
scripts:
update-rc.d -f xdm remove
In Debian, once it's installed, most things just work. For example, if
you install mozilla-firebird and flashplugin-nonfree, Flash will just
work in Firebird. I actually prefer Opera, which doesn't seem to be
included (as of January 2004) in unstable (sid). So, I download the
statically linked deb package, add the /usr/lib/mozilla/plugins path
to its plugin directories path, and then install lesstif2. After that,
Flash works in Opera as well.
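For what it's worth, the install side of that looks roughly like this; the Opera .deb filename is hypothetical, so use whatever version you actually downloaded (the plugin path itself is set inside Opera's preferences):

```shell
# Install the statically linked Opera package (filename is a guess).
dpkg -i opera-static_7.23-1_i386.deb

# lesstif2 supplies the Motif libraries the Flash plugin needs under Opera.
apt-get install lesstif2
```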
Sound and other things in the 2.6 kernel
This one took a bit of searching to get working. Upgrading to the 2.6
kernel and using ALSA took away my sound, which had pretty much worked
out of the box when I added the module for my card during installation.
Firstly, I installed alsa-base with apt-get. After that, in compiling
the kernel, as well as including my card, I added the modules for
SND_PCM_OSS, SND_MIXER_OSS, and SND_SEQ_OSS.
Then, add the following lines to /etc/modules.conf
alias sound-service-0-0 snd-mixer-oss
alias sound-service-0-1 snd-seq-oss
alias sound-service-0-3 snd-pcm-oss
alias sound-service-0-8 snd-seq-oss
alias sound-service-0-12 snd-pcm-oss
(This has, at times, been done automagically for me if the above
options are compiled in.)
I also (but only once) had an issue when upgrading to the 2.6
kernel--suddenly, although DNS was working, I was getting connection
refused from the apt sites and couldn't get to web sites either. A bit
of googling indicated that it was the tcp_ecn flag. I fixed it with
sysctl -w net.ipv4.tcp_ecn=0
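sysctl -w only lasts until the next reboot; to make the change permanent, put the key in /etc/sysctl.conf, which Debian re-applies at boot:

```shell
# Persist the ECN workaround across reboots.
echo "net.ipv4.tcp_ecn = 0" >> /etc/sysctl.conf
```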
XFree 4.3.0
NOTE: (Since this was written, XFree 4.3.0 is now in unstable. Another
thing to PLEASE note about the advice below is that experimental IS
experimental.)
The downside of Debian's well-earned reputation for stability is that
its packages are often slightly older versions. If one needs, or just
wants, XFree 4.3.0, this is how I went about getting it. First, I
added this line to my /etc/apt/sources.list
deb http://http.us.debian.org/debian/ ../project/experimental main contrib non-free
Then, run apt-get update. After that
apt-get -t experimental install x-window-system
(By the way, one can always search for packages at apt-get.org)
One of my mentors has created the following guide for determining
whether to run stable, testing or unstable. He pointed out, after
looking at this page, that despite my blithe assurances, much can go
wrong in unstable, which is why it is called unstable. I'm putting his
guide below.
Is it a server?
  Y - run Stable
  N - next question
Is it an important workstation?
  Y - run Testing, or even Stable if it's really mission-critical
  N - next question
Does it matter if this workstation actually works or not?
  Y - next question
  N - run Unstable, and install anything from Experimental that
      strikes your fancy
Is it a workstation on which you can afford some possible downtime?
  Y - run Unstable
  N - go back to the top and try again, or write another question :-)
There are a few other things that are a bit different in Debian. Many
of them are covered in the distrowatch article mentioned above, which
covers, among other things, the Debian way to build a kernel.
Debian's way to build from source is in one of their faqs, but I also
have it covered in the Debian section of my fluxbox page. I also cover
the steps necessary to get Japanese working in Debian in my Japanese
in *nix page.
The main purpose of this page is to save the person new to Debian a
bit of searching.
An Open Letter from Hector Ruiz, AMD Chairman,
President and Chief Executive Officer
The microprocessor is the brain of every computer, a transformational
technology in today's world. And as in all markets, innovation in the
microprocessor sector depends on competition--the ability of consumers
and businesses worldwide to choose solutions based on one
microprocessor over another.
Our competitor has harmed and limited competition in the
microprocessor industry.
On behalf of ourselves, our customers and partners, and consumers
worldwide, we have been forced to take action.
We have filed a 48-page, detailed Complaint in federal district court
because, as our Complaint explains exhaustively, Intel's actions
include:
- Forcing major customers to accept exclusive deals,
- Withholding rebates and marketing subsidies as a means of punishing
  customers who buy more than prescribed quantities of processors from
  AMD,
- Threatening retaliation against customers doing business with AMD,
- Establishing quotas keeping retailers from selling the computers
  they want, and
- Forcing PC makers to boycott AMD product launches.
For most competitive situations, this is just business. But
from a monopolist, this is illegal.
These serious allegations deserve serious attention. Earned success is
one thing. Illegal maintenance of a monopoly is quite another.
Intel's behavior is much more than meets the eye. You may not have
been aware, but Intel's illegal actions hurt consumers every day.
Computer buyers pay higher prices inflated by Intel's monopoly profits.
Less innovation is produced because less competition
exists. Purchasers lose their fundamental right to choose the best
technology available.
We believe the legal process will work. In the meantime, the
men and
women of AMD will continue to drive innovation, focusing on our
customers and on the people who use computers at home and work every
day.
At AMD, we know innovation.
We thrive on competition.
And we depend on a market based upon freedom of choice.
Read our Complaint. Demand innovation.
Choose fair and open competition.
Hector Ruiz
Chairman, President and Chief Executive Officer
Advanced Micro Devices
To share your thoughts about innovation and fair and open
competition with us, please e-mail breakfree@amd.com.
http://www.linux-mag.com/
http://www.linux-mag.com/content/view/2842/
Novell Looking for Acquisition Targets?
Written by Bryan Richard
Thursday, 04 January 2007
Matt Asay recently blogged about how Novell might be in the market to
make an acquisition this year in the virtualization space. He lists
XenSource and Altiris as possible targets.
If Novell wants to maximize the potential of their Microsoft alliance
and bring about a scenario like Canonical founder, Mark Shuttleworth,
outlined in a recent Red Herring interview...
Microsoft is going to claim that deploying Linux
anywhere, unless you pay Microsoft a patent fee, is a violation of
their patent and they haven't proved that yet. But they certainly seem
to be positioning themselves in such a way that they could do so.
... then you have to think they'll buy XenSource.
Why XenSource? Because it's at the heart of Red Hat's pending RHEL 5
virtualization features.
If you're into doomsday scenarios -- and you kind of have to be these
days -- you have to wonder to what extent Novell would be willing to
use as a competitive weapon the agreement with Microsoft that excludes
Novell customers from patent litigation.
If Microsoft has a patent covering Xen-like virtualization tucked away
somewhere in their intellectual property vault, then Novell could use
that to plant doubt in customers' minds about upgrading to RHEL 5.
Novell paid handsomely for that patent indemnification--both in cash
and community PR--so you have to assume they're going to put it to
use, and acquiring XenSource would put them in a position to leverage
it.
Of course, all of this idle speculation on a slow news day could
amount to nothing. But regardless of whether the intellectual property
threats are real or implied, the open source market seems to have
graduated from feature wars to information wars. The open solutions of
2007 could start to be judged not just by whether they solve technical
problems but by whether they also pass muster with a company's Chief
Legal Officer.
And that's a shame. The last thing that Open Source vendors need is
customers asking, "Is it safe?" It's what SCO aimed for and failed to
accomplish.
But if done correctly, Novell could show SCO a thing or two about how
the game is played.
http://www.redherring.com/
http://www.redherring.com/Article.aspx?a=20495&hed=Linux%3A+Ubuntu+Founder+On+Microsoft+%E2%80%9CChallenge%E2%80%9D+
Linux: Ubuntu Founder On Microsoft
"Challenge"
Canonical CEO Mark Shuttleworth talks about why it may finally be time
for Linux to out-innovate Apple and Microsoft on the desktop.
December 29, 2006
By Falguni Bhuta
Taking a trip into space hasn't been Mark Shuttleworth's biggest
challenge. Instead, the one-time space tourist counts building an
open-source company and working to hook users on Linux as his most
testing venture.
Mr. Shuttleworth founded the Ubuntu project in 2004 to distribute a
free desktop operating system based on Debian Linux that would compete
with Microsoft Windows.
In 2002, Mr. Shuttleworth took a trip into space becoming the first
South African in orbit and the second space tourist ever. The
entrepreneur previously founded Thawte Consulting and sold it to
Verisign for $575 million, has headed a venture capital firm, and has
finally settled on pushing open-source software as his next career
move. The Ubuntu project is controlled by UK-based Canonical, where Mr.
Shuttleworth is the CEO.
Mr. Shuttleworth spoke with Red Herring about recent developments in
the world of open source and his plans for Ubuntu.
Q: How are the events in the open source industry in the last few
months affecting Ubuntu, if at all?
A: There were two big strategic announcements made this year by
non-Linux players. Oracle [said] that they will be providing support
for [a version] of Red Hat Linux without the trademark, and Microsoft
and Novell announced they would collaborate in a number of areas; that
announcement had some interesting intellectual property
considerations. Both of those have been significant for us.
On the Oracle front, it hasn't really changed our position, because we
remain the only group focused on providing Linux free of charge but on
a commercially sustainable basis. So Oracle competing with Red Hat may
cause some commercial distress without really impacting our strategy.
The Microsoft announcement is potentially more interesting. It is
setting us up for a situation where it will potentially be almost
impossible for Linux to remain a free platform, if Microsoft is able
to assert a considerable level of intellectual property ownership over
Linux. That would have a very significant impact on developer
participation and innovation in Linux; we feel that it is a very
significant challenge.
Q: So are you saying that if Microsoft finds a way it's going to be
hard for Linux [to] stay as a free piece of software?
A: Microsoft is going to claim that deploying Linux anywhere, unless
you pay Microsoft a patent fee, is a violation of their patent and they
haven't proved that yet. But they certainly seem to be positioning
themselves in such a way that they could do so.
They are really trying to get something to legitimize their claim, so
the deal with Novell had a lot of money attached to it. And as a part
of that deal, Novell has in effect lent its standing to legitimize
much of that claim. So it's a very interesting strategic move, and
there are a lot of people on the other side saying that there are
absolutely no intellectual property issues with Linux, that this is
kind of a game.
Q: What are your thoughts, do you think that they have enough
Intellectual Property to threaten Linux users to sue them if they don't
use Microsoft?
A: It's very possible that Microsoft does have a patent amongst
hundreds of thousands of patents out there, which covers something that
Linux does, but until they come out and say which patent it is, it's
impossible to know. No one is ready to overrule the patents out there
[and] no one could possibly make a decision about that other than
Microsoft.
The other thing, of course, is that as soon as it's clear, if there is
some sort of infringement of Microsoft intellectual property, it's
usually very easy to work around it, essentially by rewriting the
piece of software in question. So a lot of people are very confident
that even if there is something that infringes a patent, it could very
quickly be resolved. So Microsoft didn't actually want to be on the
hook for saying this is a specific patent that is infringed, because
then the Linux developers would work around it.
On the other hand, what they do want to do is make customers feel
slightly nervous about Linux. I think Microsoft is certainly becoming
a smarter operator in how they interact with Linux and with free
software. They spent a lot of time saying it doesn't exist, it is a
toy, it is a cancer, it is dangerous, calling it anti-capitalist, and
now they seem to be engaging with that problem in a much more
realistic, pragmatic, competitive fashion.
On Desktop Innovation
Q: So do you think that Ubuntu, and maybe another player such as
Linspire, will have to eventually team up with Microsoft to avoid any
conflicts?
A: We would never pay a patent license fee to Microsoft, because we
don't believe that there are any patent issues, and our economic model
is essentially to make the software freely available. And if you are
making the software freely available, you obviously can't pay a patent
license to somebody else for each copy of a particular version. So while
Novell may feel that they can do a deal like that, Ubuntu would
absolutely never do it.
I think obviously we will be leading the charge in terms of showing the
weak points in Microsoft's assertion. It also means that we will be
able to represent at least a threat to Microsoft, because not only
will we be competing on the end product but we will also be competing with
their way of doing business [because] we have a different economic
model, fundamentally different from theirs. There is no per-seat
license fee. If they want to, they can negotiate with us on the same
terms that they would negotiate with someone like Novell.
Q: You said it has been a big year for Ubuntu. Why do you say that,
what are some of the milestones that you have reached?
A: A couple of milestones: this is the year that we put out our first
enterprise release. We have had a reputation over the last two years
of focusing very heavily on the desktop, which made us a very popular
version of Linux for Linux power users. We have grown to the point
where those power users were starting to say "I would really like to
start using Ubuntu on my servers at the company where I work, but in
order to do that, you need to provide support for a much longer period
of time," and so this is the year that we put out our first
enterprise-quality release.
Also, that was the year where we formed a couple of very significant
partnerships and alliances. Our relationship with Sun was very
significant; for Sun Microsystems, it was a pragmatic way to embrace
Linux. They had some very interesting new hardware technology in the
form of a massively multi-core chip. The Sun guys are coming up with
processors that have six to nine cores. So they wanted to be able to
take advantage of that hardware in the Linux base, and we were able to
work with them on that.
Q: What about growth in adoption rates, any kind of numbers that you
can give me?
A: We know now that there are probably at least 8 million [Ubuntu]
users.
Q: Do you think 2007 is the year of Linux on the desktop?
A: It's been the next year for the last five or six. I certainly think
that we are seeing an acceleration in interest in the open source
community solving desktop-type problems. If you [go] back five or six
years, the people writing the core software in Linux were all people
who were responsible for servers. [Now] we are seeing a real shift in
that the open source community [itself] suddenly wants to solve the
desktop problems. They want to show that innovation in free software
on the desktop really can demonstrate the exciting ideas that people
expect from their Windows or their Mac.
Microsoft and others, a lot of them, say that free software and open
source is all about copying what was done before in proprietary
software, and for a long time that was true. What we are seeing is
that, as soon as the free software reaches a point where it's as good
as the proprietary software, suddenly all the innovation shifts to the
free software. Original innovation--and we saw that with [browsers],
where once Firefox had reached parity with Internet Explorer, it
suddenly became this hotbed of innovation, and now it's Microsoft
that's scrambling to catch up with the free software browser.
Now, what may happen in 2007 is that we suddenly see desktops becoming
very hot ground for innovation, with new ideas, desktop-style ideas,
coming through in Linux that suddenly make the proprietary software
guys feel like they have to catch up. Which is different, of course,
[from] saying that 2007 will be the year when everyone suddenly wants
to switch to Linux, but it could well be the year when Linux starts to
pull ahead in terms of innovation and the pace of development.
On Microsoft Vista
Q: 2007 also saw the launch of Microsoft Vista: how is that going to
affect Linux, especially on the desktop? Do you think Vista will kill
Linux on the desktop, or will it actually make more people adopt Linux?
A: Well, that gets into a very complicated set of factors. On the one
hand, Vista is a very polished product, [it has good] features, and
[Microsoft] needs to be credited for that work. On the other hand, it
is expensive - more expensive than previous versions of Windows. Given
the way they have priced it and the number of different versions of it,
when you figure out what most people are going to end up paying, it's
gone up.
They are going to try to enforce their licenses much more aggressively
in parts of the world where that pricing is an issue, where people have
- until today - continued to use pirated versions of Windows. Suddenly
[they will] have to consider other options, and we do see that in
emerging markets in particular, Linux is taking off because people want
a stable platform that is cheaper. So in that sense, yes, Linux
might well benefit from the release of Vista from a pricing and
licensing control point of view.
In other senses, Vista is going to drive a whole round of application
development, a whole round of other things, which will be incompatible
with Linux. In Vista, Microsoft has taken steps to break compatibility
with pieces of the Windows infrastructure that Linux has already
reverse engineered. If you look, for example, at the file-sharing
capability in previous versions of Windows, Linux is very compatible
with that, so you can easily deploy Linux in the same environment as
Windows; Microsoft has taken some very specific steps in Vista to make
that hard for Linux to do. So for one thing, it is going to be an
interesting, competitive time. I was somewhat surprised at the low-key
nature of the Vista release - I don't know if you are seeing a huge
amount of publicity, [but I have seen] virtually nothing yet.
Q: What are your feelings towards Microsoft?
A: It is difficult to have a simple opinion about an organization of
about 75,000 people; there are always going to be individuals there
with very bad ideas and other individuals with very good ideas.
I think that they should be credited with commoditizing software: if we
go back to the days before Microsoft, software was enormously expensive
because everybody customized their product. [They] turned computing and
software development into a real commodity. At the same time, they are
a convicted monopolist.
I think there is an interesting thing that Microsoft is going to learn,
and that is that maybe in the eighties and nineties the most efficient
way to produce software was to hire the smartest guys and put them all
on one campus. But maybe in the year 2000 and beyond, the most
efficient way to produce software is to allow people to gravitate to
the parts of a software environment that they are most interested in,
and then to choose to collaborate in real time from wherever they want
to be in the world.
Which is why I am so interested in 2007, potentially, as the year in
which Linux innovation on the desktop starts to outshine the innovation
of Apple and Microsoft. You've got to see that getting brilliant
people, scratching their own itches from anywhere in the world and
collaborating on the Internet, is not just a great way to make cheap
software; it is a great way to make phenomenal, surprisingly good,
unexpected software - breakthrough software. If you look at the great
companies that have been produced over the last ten years in the
technology sector, it is hard not to see that the vast majority of them
have a deep relationship with open source: Google, Yahoo, eBay.
[These companies] and others are fundamentally driven by open source,
and they were built by people who were empowered to do what they could
do with open source.
Q: What is the way for Ubuntu to expand the use of Linux on the
desktop? Do you think working with proprietary companies to port their
codecs to Linux for multimedia use such as video and audio is necessary?
A: Sure, anything like that helps, to the extent that software people
are familiar with on Windows or on the Mac is also available on Linux -
and that can cut both ways. One approach is to go to the proprietary
software companies and get them to port to Linux, but the other way to
do it is to take open source software and make it available on Windows
and on the Mac.
Having said that, this is by far the most complicated thing I have ever
been part of. I often sit here and wonder: even if we can produce a
platform that is more exciting, more robust, distinctively virus-free,
unlikely to get spyware, and better by every single measure - will the
world switch? I don't know.
Q: Is it more complicated than, for example, going into space?
A: No. I had very specific responsibilities in the crew once I was
certified and trained for those responsibilities. In this case, in many
instances, what we are trying to do has never been done before. We are
trying to change the way people think about the economics of software
fundamentally - not just substituting one product, a $99.99 product,
with another that is $49.99, but changing the way people think about
the economics of software and changing the habits that people have with
computers. Today I am privileged to be a part of it.
Q: What is your next big challenge other than open source and Linux?
A: I don't have a list. I am not going to be multitasking; I would like
to go deep into one problem and [see it through to a] conclusion. I
love this project, I hang out with many interesting people, and I think
we are changing the world, so I don't daydream a lot about other things
that I am not doing at the moment. When the Linux project is done,
whichever way it works out, I'll take a look at the state of the world
and see what problems are interesting at that time. Everything in life
is contextual, partly in place and partly in time; what's interesting
today is not necessarily going to be interesting when I have the time
to look at it.
-------------------------------------------