[sc34wg3] Merging/Viewing subject proxies
Steven R. Newcomb
sc34wg3@isotopicmaps.org
26 Jul 2005 13:33:07 -0400
Jan Algermissen <jalgermissen@topicmapping.com> writes:
> Patrick,
>
> On Jul 26, 2005, at 4:22 PM, Patrick Durusau wrote:
>
> > hhh = { < name = "rabbit, coney" >,
> >         < webresource = "www.rabbitnetwork.net, en.wikipedia.org/wiki/Rabbit" >,
> >         < classification = "Oryctolagus cuniculus" > }
> >
> > Of course I am presuming that the disclosure for "name" allows the
> > creation of a list of names and provides that if any of the "names"
> > in the list match, further viewing with other subject proxies that
> > have either "rabbit" or "coney" for the name property will occur.
> >
>
> Having spent about a year implementing what happens when proxies
> merge and how the merged values demand further merges, etc., and
> having especially tried to trim the algorithm for this stuff down to
> O(log N), I must say that the datatype magic you describe (here,
> converting scalar to set as needed) is unlikely to be doable. The
> consequence, IMHO, is that most value types should come as sets in
> the first place (e.g. 'names' as opposed to 'name' in the example).
With my programmer's hat on, I agree with you, Jan. When I put on my
businessperson's hat, I agree with you even more. However, with my
ISO editor hat on, I can't see any reason to *prohibit* magic of
various kinds, even if we can't see any purpose in allowing it. The
number of things we don't know exceeds the number of things that we do
know.
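Jan's set-valued design can be sketched roughly as follows. This is a
hypothetical illustration, not anything from the TMRM or from Jan's
implementation; the property names ('names', 'webresources',
'classification') are just the example's, pluralized as Jan suggests.
The point is that when every value is already a set, merging two
proxies is a plain set union, and no scalar-to-set conversion has to
happen at merge time:

```python
def should_merge(a, b, key="names"):
    """Two proxies are viewed as the same subject when their
    name sets intersect (the rule Patrick's disclosure describes)."""
    return bool(a.get(key, set()) & b.get(key, set()))

def merge_proxies(a, b):
    """Merge two proxies whose property values are already sets:
    the merged value for each key is simply the union."""
    merged = {}
    for key in a.keys() | b.keys():
        merged[key] = a.get(key, set()) | b.get(key, set())
    return merged

hhh = {"names": {"rabbit", "coney"},
       "webresources": {"www.rabbitnetwork.net",
                        "en.wikipedia.org/wiki/Rabbit"},
       "classification": {"Oryctolagus cuniculus"}}

other = {"names": {"coney", "cony"},
         "classification": {"Oryctolagus cuniculus"}}

# "coney" appears in both name sets, so the proxies merge:
if should_merge(hhh, other):
    hhh = merge_proxies(hhh, other)
```

The contrast with the scalar case is that a merge never has to inspect
a value's type and promote it; union is the only operation needed.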
> All this becomes really, really nasty when it comes to proxies being
> (parts of) values...
It's not nasty. It's just complicated. You have to think differently
than usual. As you of all people know, there is some pain involved in
adapting one's thinking to a paradigm that places subjects, rather
than objects, at the center of everything. The objects become
temporary, and references to them have to be managed, somehow.
They're manageable, though, as open source software demonstrates
(http://www.versavant.org, for one).
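One way such references can be managed, sketched here purely as an
assumption (this is not the VersaVant implementation, just a
union-find-style indirection table), is to resolve every proxy id
through a redirect map, so that values mentioning a proxy that has
since been merged away still reach the surviving proxy:

```python
class ProxyStore:
    """Minimal sketch: proxy objects are temporary; ids stay valid
    across merges by resolving through a redirect table."""

    def __init__(self):
        self._redirect = {}   # merged-away id -> surviving id
        self.proxies = {}     # surviving id -> {property: set of values}

    def add(self, pid, props):
        self.proxies[pid] = props

    def resolve(self, pid):
        # Follow redirects until we reach a surviving proxy id.
        while pid in self._redirect:
            pid = self._redirect[pid]
        return pid

    def merge(self, a, b):
        """Merge proxy b into proxy a; any reference to b's id
        continues to resolve correctly afterwards."""
        a, b = self.resolve(a), self.resolve(b)
        if a == b:
            return a
        for key, vals in self.proxies.pop(b).items():
            self.proxies[a].setdefault(key, set()).update(vals)
        self._redirect[b] = a
        return a

store = ProxyStore()
store.add("p1", {"names": {"rabbit"}})
store.add("p2", {"names": {"coney"}})
store.merge("p1", "p2")
# A value elsewhere that referenced "p2" still resolves:
assert store.resolve("p2") == "p1"
```

When values themselves contain proxy ids, each merge can then trigger
re-examination of the values that mention either merged id, which is
exactly the cascade Jan describes.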
> This is not to say that the RM is not brilliant....I just think there
> is serious stuff in there that would need to be made explicit and
> proven as doable. (There might well be problems lurking in there that
> are not computable at all in finite time, dunno)
It's true that TMRM is not attempting to legislate good design. We're
just trying to make designs -- good and bad -- disclosable. Anyway,
the goodness or badness of a design depends on the context(s) it's
intended to serve. In some contexts, it's better not to spend much
effort on optimization, so a "bad" design that is suboptimal with
respect to the computing resources it uses is actually very good. In
other contexts, minimization of resource consumption is much more
important.
This reminds me of something. Many years ago, before the WWW, I had
my first meeting with Tim Berners-Lee. I was appalled that, according
to his thinking at the time, HTML was not intended to support SGML
<!ENTITY...> declarations. I told Tim, point-blank, that this was a
horrible mistake, because it would make HTML documents unmaintainable.
I said, "We'll have broken links everywhere, and no easy way to fix
them." He paid no attention to me. And, by the way, I guess he was
right. (And so was I, but my concern was irrelevant to the context of
his endeavor.)
Jan, I would like to explore your hunch that, in your contexts, there
is something non-doable about the TMRM. Personally, I have a hunch
that the non-doability will turn out to be due to the approach you are
thinking of using, rather than to any constraints the TMRM may place
on your disclosure of that approach.
-- Steve
Steven R. Newcomb, Consultant
Coolheads Consulting
Co-editor, Topic Maps International Standard (ISO/IEC 13250)
Co-editor, draft Topic Maps -- Reference Model (ISO/IEC 13250-5)
srn@coolheads.com
http://www.coolheads.com
direct: +1 540 951 9773
main: +1 540 951 9774
fax: +1 540 951 9775
208 Highview Drive
Blacksburg, Virginia 24060 USA