Re: [wg-c] Unofficial report on L.A. meeting
Sorry for the delay. I've been traveling, visiting family, often offline, and
without means of sending e-mail even when I could download it.
Here are my comments from the WG-C meeting.
On Monday, Nov. 1, I spoke with Jonathan Weinberg, suggesting that it would be a
good idea for the members of WG-C and other interested parties who were present
at ICANN/LA to use the opportunity to gather together in person. I made the same
point Tuesday before the public meeting of the Names Council. The WG-C meeting
came together at 8 AM Wednesday.
There was also a kind of WG-C bar BOF on Monday night. Paul Stahura had been
very active there, putting together some interesting ideas. I hope he'll
eventually compile those notes and post them here.
These are the comments I presented to the group Wednesday morning, with
considerable embellishment.
1. Self-Selection as Justification
During the course of discussion about how many and which new gTLDs should
ultimately be added, I again stated my preference for many, on the principle
that allowing people wide latitude in self-selection of a domain name is more
desirable than slotting DN holders into a pre-ordained classification. We can't
know beforehand how people will invest meanings or status in gTLDs, and the DNS
is sufficiently scalable (from a technical standpoint) that it's not necessary
to tell them how they should do so.
2. Why New gTLDs at all?
I then asked how we might establish some measure by which we would know more
precisely when there is a pressing reason to end the delay in adding new gTLDs.
Though past attempts to build alternate root systems like Alternic failed to
gain much influence, and though new schemes like TLDNS seem to show little
depth, one is nevertheless obliged to ask how well the current system can scale
to meet user demand, and whether the limit of three open-access gTLD choices
will eventually be perceived as a constraint on growth and stability.
What if a coordinated demand for short names in a new gTLD eventually led to the
entrenchment of one or more alternate roots? I think this would be a lamentable
outcome of the current paralysis, since it would undermine the coherence of
Internet-based communication. While there seems to be no real problem now, the
situation could change over time, so it would be important to monitor the
relative popularity of alternate roots.
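As an illustration only (not something discussed at the meeting), here is a
minimal sketch of one way such monitoring could be spot-checked: probing whether
a name under a non-ICANN TLD resolves through an ordinary resolver versus a
resolver pointed at an alternate root. The resolver address and probe name below
are hypothetical placeholders.

    # Minimal sketch (hypothetical values throughout) of probing whether a name
    # under a non-ICANN TLD resolves via the default root vs. an alternate root.
    import dns.exception
    import dns.resolver  # dnspython

    ALT_ROOT_RESOLVER = "203.0.113.53"   # hypothetical resolver using an alternate root
    PROBE_NAME = "example.web"           # hypothetical name under a non-ICANN TLD

    def resolvable(name, nameserver=None):
        """Return True if `name` resolves to at least one A record."""
        resolver = dns.resolver.Resolver()
        if nameserver:
            resolver.nameservers = [nameserver]
        try:
            return len(resolver.resolve(name, "A")) > 0
        except (dns.resolver.NXDOMAIN, dns.resolver.NoAnswer,
                dns.resolver.NoNameservers, dns.exception.Timeout):
            return False

    print("default root:", resolvable(PROBE_NAME))
    print("alternate root:", resolvable(PROBE_NAME, ALT_ROOT_RESOLVER))

Tracking how often such probes succeed across many resolvers, over time, would
be one rough proxy for the relative popularity of an alternate root.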
My understanding is that there is no technical reason to add gTLDs to the root
at the current time. The primary motivations to do so throughout the DNS
controversy have involved goals related to social engineering rather than
computer engineering.
Such goals include:
1) reducing the pressures to engage in speculative bidding for names in .com
which have led to cybersquatting and other signs of market failure (Mueller);
2) challenging the overall dominance of .com and thus reducing the financial
windfall that has accrued to NSI (Crocker);
3) privileging specific groups which might seek to consolidate their identities
by way of the DNS (Brunner);
4) creating and fostering a new competitive registrar industry around the world
to serve local clients (many, if not all of us here);
5) leveraging economies of scale to enhance the possibilities of name
self-selection (me, I guess); and
6) self-interested desires to run rent-seeking, profitable registries (Ambler
et al.).
As we well know, these goals are not all compatible.
From a technical standpoint, 'it ain't broke yet, so there's nothing to fix.'
But from a social standpoint, many people think things are definitely broken,
and argue that they can be fixed by implementing various new gTLD models.
Of course, our TM friends believe that adding TLDs will break things.
Nevertheless, doing so would not undermine connectivity or interoperability on
the Internet. Yet it would change the nature of Internet-based advertising by
further reducing the utility of the DNS as an ad hoc directory service and
potentially raising the costs of performing TM surveillance. Again, these are
social rather than technical considerations. (I know the DNS is not meant to be
a directory service, but some people use knives as screwdrivers.)
3. What is to be done?
Finally, I offered what I now call "the principle of the more feasibly fixed
fiasco." If and when new gTLDs are added, I believe it would be wiser to start
with non-profit/cost-recovery gTLDs rather than proprietary gTLDs.
The underlying presumption is that if either approach is later deemed to be
mistaken, it would be easier to parcel off a non-profit registry (say, by
auction) than to take the registry from a private owner. In short, the
public-resource blunder would be easier to fix than the proprietary blunder.
The last four years of experience with NSI provides incontrovertible proof of
how difficult it can be to get a powerful proprietary registry to modify its way
of dealing with the Internet community. Consider the grief that has occurred
over questions of defining norms of conduct for interacting with registrars,
adhering to a community-supported DRP, maintaining whois accessibility, etc.
Chris Ambler asks for the same (very cushy) deal that NSI got. I say the
Internet community should be spared a replay of this debacle. Fortunately, Jon
Postel was wise enough to resist Ambler's pressure tactics. The proprietary gTLD
model has had its chance, and it's time for an alternate model to be given a
test.
I believe the choice provided by "the more feasibly fixed fiasco" principle can
expedite progress, given the paralyzing lack of consensus we have seen on the
issue of which registry model to adopt.
Thanks for reading.
Craig Simon