[OpenID] [security] Trust + Security @ OpenID
Eric Norman
ejnorman at doit.wisc.edu
Sun Jul 22 01:23:20 UTC 2007
On Jul 21, 2007, at 6:19 PM, Meng Weng Wong wrote:
> On Jul 22, 2007, at 6:14 AM, Eric Norman wrote:
>> On Jul 21, 2007, at 1:37 PM, Peter Williams wrote:
>>
>>> What we need now are protocol hooks and UI concepts that implement
>>> these raw technologies in a fashion that consumers can manage – and
>>> thus impose their view of trustworthiness on the world – as they see
>>> it.
>>
>> I will assert one thing that you can take as "Gospel",
>> if you so choose. This is not a problem that technology
>> can totally solve, but it can make a contribution. Ergo,
>> technophiles who proclaim things like, "We'll solve that
>> problem for you" are really doing the world a disservice
>> in the grand scheme of things.
>>
>
> Hear, hear. This is one of those messy situations where the
> problems are fundamentally social, and so we run into the old saw
> "don't try to apply a technological solution to a social problem!
> You! Will! Fail!".
>
> But our job is still to create technologies -- in this case,
> technologies that are explicitly social. What are we to do?
>
> The tradition of "technology is value neutral!" says our mission is
> to expose affordances that permit human psychology to express itself.
I have a suggestion that might help keep it neutral, but it's
probably way too radical. What if technologists were to completely
expunge the word "trust" from their vocabulary?
And affordances? I don't think you mean that in the same
sense that Don Norman (no relation) does when he writes
about human engineering and usability, but it's close.
> Could you imagine if programmers wrote programs only for other
> programmers?
I find that easy to imagine (cynical me). Isn't that what they
actually do? Isn't that the audience programmers play to?
Well, at least the majority. :) :) :)
> We've learned the hard way that end-users use applications in ways
> that don't always make sense to the programmers who build them.
And we're starting to learn that the mental model a user
forms about how something operates hardly ever matches
the model that programmers have, and that recognizing this
gap is important. That's more commentary about usability
and human factors. See Peter's use of the phrase
"UI concepts" above, along with the "as they see it" part.
> The recent "libertarian paternalism" exchange between Sunstein/
> Thaler/Mitchell is worth checking out.
> http://en.wikipedia.org/wiki/Soft_paternalism
Indeed it is. Thanks for the references.
Eric Norman
http://ejnorman.blogspot.com