Re: [wg-c] Re: nine principles for domain names
At 04:35 PM 2/21/2000 -0500, Milton Mueller wrote:
> > "We believe that the requirement that all gTLDs have associated
> > descriptions of their intended meaning."
>Bret Fausett has just decisively answered this question. It is not only
>possible, but highly likely, that the most popular new TLDs will be ones
>that are notable only for their memorability, not for their "meaning."
>E.g., what is the meaning of
It is always fascinating to see definitive statements about the future,
particularly when there is no objective basis for treating the statements
as anything other than pure conjecture.
As to "memorability" vs. "meaning"... This attempts to draw a stronger
distinction than the context allows, especially since this context is not
one limited to psychologists and/or linguists. The distinction also might
be wrong.
That is, this discussion group is not one for highly precise linguistic
distinctions, absent careful definition and group agreement. One of the
sources of conflict and confusion in these discussions has been one person
or another trying to assert a particular, precise definition (legal,
technical, or the like) when the group, overall, is using the term more
loosely.
Then we get to the question of "memorability". Many things aid the ability
to remember. Semantic association is one of them, and a particularly
strong one for "strings". I suspect that the problem, here, is an attempt
to treat strings that do not appear in a dictionary as having no "meaning"
but possibly having some memorability. That is far too precise and
constraining a distinction for this discussion (or just about any other, in
my opinion). "XXX" is not in a dictionary but has nearly globally-clear
meaning.
The modified text I suggested calls for a gTLD to have some associated text
which describes its intended use. The association between that text and
the gTLD string is the intended "meaning". The presence of that meaning is
intended to underscore the mnemonic potential for the gTLD.
I suspect what Milton is forgetting is that strings with a "meaning" do not
have it inherently. It is through association with other things in the
world that we impart that meaning. Hence, there is nothing in the least
wrong, or even unusual, about referring to the string "zzz" as having
meaning, given a legitimate associative context.
>.zzz? Who would want to register there? No one knows for sure, but I feel
>very confident that such a TLD would end up with a lot more registrations
>than the plonky .store proposed by the IAHC. Same goes for a string of
>Chinese characters, which
Milton's confidence is nice, but unfounded. That does not mean he is wrong
-- though *I* am confident he IS...
More importantly, the interesting question is not whether it would be more
popular, but why it would be. I am confident that a serious effort to
learn the answer would show that the string has a particular MEANING for
its target audience.
>may evoke several meanings.
More than one meaning is just fine. There is nothing wrong with a gTLD
having the encoding efficiency of poetry.
> > What makes the small group of registry operators better than other bodies
> > for doing the selecting?
>
>The simple fact that it is their investment at risk. They are bidding
>their labor and their money that people will want to register under a
>specific name.
Right. Like NSI, using a granted monopoly.
Sorry, but they are plugging into an inherently restricted environment, one
which must display broad-based stability. That is, stability is not just a
question of whether servers respond, but whether the entire DNS operations
and administration cycle is stable.
>Registry operators are directly accountable for what they offer in a way
>that ICANN and certainly this little working group can never be.
NSI is "accountable" to its shareholders, not its users. It has pretty
much ignored its customers, with improvements coming only as its monopoly
control over them has been threatened. That threat came from controlling
authorities, not "the marketplace."
>If we make stupid or unduly restrictive decisions, what are the
>consequences to the members of this committee? There are none. It doesn't
>affect our paycheck or our workload in any significant way. If on the
>other hand a registry offers services that no one wants, there is a
>substantial loss. This is true regardless of whether the registry is for
>profit or not.
That's fine, if you only look at things from the perspective of the registry.
It's not so fine if you look at it from the perspective of the customer.
> > Anticipating a "let the market decide" line of thinking will get us to
> > companies' experimenting with the marketability of a string and then going
> > out of business when it is not profitable. At that point, the unfortunate
> > organizations that chose to register under the TLD will be left without
> > service.
>
>Several fallacies here. First, zone files can easily be escrowed and
>transferred to another operator. Any contract between ICANN and TLD
>registries can be structured
That's like the cliche of filling the upper-left and lower-right of a board
with equations and tying them together with the phrase "let a miracle
happen." The suggestion sounds easy, because talking about it IS easy, but
the details to make it happen are rather more difficult.
World peace is easy. It only requires a simple agreement and a simple
enforcement mechanism.
In addition, the hand-wave includes a technical error: zone files do not
contain customer record information. If this registry failure-handling
mechanism is to work, it must transfer all relevant customer data, not just
what is in the zone file.
So the word "easily" is not quite so easy to apply, here.
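To make the gap concrete, here is a rough sketch, using hypothetical names,
of what a gTLD zone file would and would not carry for a single
registration:

   ; In the registry's zone file: only the delegation data needed to
   ; resolve names under the registration.
   example.shop.    IN  NS   ns1.hosting-provider.example.
   example.shop.    IN  NS   ns2.hosting-provider.example.
   ;
   ; NOT in the zone file: the registrant's identity, postal and email
   ; contacts, billing status, registration and expiry dates, and any
   ; authentication data. All of that lives in the registry's separate
   ; customer database and would have to be escrowed and transferred
   ; on its own.

Escrowing the first part is routine; arranging an orderly hand-off of the
second is where the real work lies.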
>to minimize this risk and to compensate for it when it happens.
>
>Incidentally, similar risks occur if your ISP goes out of business, but I
>don't see
ISPs rarely have any sort of monopoly control over end-users, since the
cost of changing ISPs is typically small. For that matter, one reason to
have a domain name is to reduce the ISP's control, by disassociating the
customer from the ISP's own domain name.
I've changed ISPs a number of times, but it has been transparent to those
dealing with me. That would not have been true if I had been forced to
change my domain name.
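As a rough sketch, again with hypothetical names, that decoupling shows up
in a single DNS record:

   ; Mail for the customer's own domain is routed to whichever provider
   ; currently serves it:
   example.org.     IN  MX  10   mail.current-isp.example.
   ;
   ; Changing providers means updating that one record; the addresses
   ; people actually use (someone@example.org) never change.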
> > This gives a good indication of the reason that DNS service needs to be
> > counted as an infrastructure and the sole-source technical constraint --
> > only one organization can be the registry for a given TLD -- mandates that
> > operational stability include assurance that TLDs persist. Going out of
> > business is an extreme form of instability. We do not tolerate it for
> > telephone numbers or street addresses; why tolerate it for domain names?
>
>This is a regulatory issue that can be and is handled at the national
>level. At the
Huh? gTLDs are not "national" entities. They are global.
>national level, we have professional regulators functioning according to
>established laws with significantly more resources than DNSO. Telephone
>numbers are unique globally, as are domain names, but the ITU committee
>that administers the numbering plan doesn't license all world telephone
>companies, nor attempt to tell
In the case of telephone numbers, the ITU delegates to national
agencies. They, in turn, delegate number space below them. So the ITU
very much DOES control allocation of the first-level portion of the number
plan.
In the case of gTLDs, ICANN does NOT delegate to national agencies. It
does need to delegate to some set of agencies, and that delegation needs to
entail control on a par with what the ITU exercises for telephone country
codes.
>them how to run their business. It is simply a coordinating agency. That
>is ICANN's only legitimate role.
If you think the ITU is only a "coordinating" agency for its assigned areas
of responsibility, I suggest you would greatly benefit from spending time
in some ITU committees...
> > [Re: numbers of new TLDs] High estimates are VERY high. Low estimates
> > are in the thousands. If we believe the low estimates and they are
> > wrong, we can add more names safely.
>
>OK. Let's start with thousands, then, instead of millions. See, Dave, I'm
>always willing to compromise. And since the largest list of proposed TLDs
>I've seen does
Milton's declared flexibility notwithstanding, an upper limit of
"thousands" has been the accepted (lower) reference for several years.
>not exceed 300, we can conclude that there is no need to make restrictive
>choices
Interesting number, 300. That was the preliminary limit in the Postel
proposal. So the current list would consume ALL of the space originally
proposed.
In any event, taking a current list of 300 and somehow extrapolating low
demand for names from it suggests a very different approach to system
management (and market projection) than any responsible operator would take.
A responsible operator must anticipate greater growth than is "obvious",
since they cannot afford to have too little capacity. They must make
estimates based on high-end possibilities, asking what happens if things
grow faster than planned.
>as to which ones are allowed and which are not. Scarcity in the economic sense
>exists when demand exceeds supply. Your lowest estimate vastly exceeds known
>demand.
Let's see. This "known demand" is based on an operation that has had no
growth (no new gTLDs) since the net became a consumer market.
So there is no objective basis for guessing what will happen if the
floodgates are opened without any controls on them.
That's a very interesting way to manage a critical piece of
infrastructure. Just a tad more fragile than a responsible operator would
indulge in.
> > If we believe the high estimates and they are wrong, we break the
> > DNS (and, therefore, make the Internet useless for almost all users). Hence
> > there is a requirement to approach the question of name space size with
> > significant caution. It's fine to call that constraint "artificial" but
> > the label does not make the constraint any the less advisable.
>
>According to your comments above, you consider a few thousand new TLDs as
>being on the extreme low end of caution. Since this WG is proposing to
>start with 6-10 and add on more at the rate of 50-100 a year, this is
>really a non-issue. We are being cautious to a fault.
Only if one constrains one's concerns to the technical, name-space capacity
of the root.
Remember that "stability" entails a broader set of issues, including DNS
administration. For THAT, alas, we have a very poor expansion history --
as in none -- for gTLDs.
At 03:36 PM 2/21/2000 -0800, Karl Auerbach wrote:
>This layer of indirection is the main virtue of DNS. The fact that humans
>have (temporarily) been involved in typing in DNS names, which has caused
Slipping in the parenthetical (temporarily) presumes a change in computer
and network use that has been projected for nearly 20 years. It has yet to
occur and there are good reasons to doubt that it ever will. Business
cards are one example. You need to be able to put useful strings onto
business cards. The holder of the card must be able to use the string.
Perhaps everything will be entirely automated in the future, but that is
not going to happen for the global Internet anytime soon.
Changes to infrastructure mechanisms happen slowly. Changes to a global
infrastructure happen VERY slowly.
>all of this debate, is merely an ancillary characteristic of DNS. And the
Ancillary, in Karl's opinion. Others view the mnemonic feature of domain
names as an extremely important benefit.
Remember that hostnames were introduced at a time when the user population
was a small set of geeks who were comfortable debugging programs in
octal. That community thought dealing with numbers was just fine, so there
must have been some reason the layer of abstraction was chosen to be
human-oriented "semantic" strings. That reason was not minor then and is
not minor now.
>end of that human relationship with DNS is something that will slowly and
>substantially diminish as true directory services become more of a true
Again, please note that that claim has forcefully been put forward since
the beginning of the X.500 directory service standards effort, roughly 20
years ago, and it is not yet close to fruition.
>And when one says "stand for something" one is saying something about the
>linguistic context of a human user. And given the billions of people on
>this planet who speak non-European languages, any choice of linguistic
>context is necessarily going to be the wrong choice for a significant part
>of the world's population.
1. And it will be RIGHT for a significant part.
And there is nothing horrible about having some disparity there.
2. People are pretty good about acquiring and using "foreign" terms.
The Malaysian term for table is "meja"; that's Portuguese. An
American term for "a little bit" is skosh; that's Japanese. And then there
is the global term, Coca-Cola...
The fact that a DNS string might not be immediately intuitive to a
user segment is not an automatic impediment. There is a yin and yang
balancing act when choosing names...
Although there is merit and convenience in Winer's observation
about the global prevalence of English, I do not believe that point is
essential to this issue.
d/
=-=-=-=-=
Dave Crocker <dcrocker@brandenburg.com>
Brandenburg Consulting <www.brandenburg.com>
Tel: +1.408.246.8253, Fax: +1.408.273.6464
675 Spruce Drive, Sunnyvale, CA 94086 USA