Monday, August 11, 2003

Patrick Logan asks about SharePoint.  I've added my response to the C2 wiki, but it's reproduced here. 

At my employer, we used SharePoint Team Services for the last project, and after 2 years, I've decided it was an abject failure that essentially degraded into a fancy network share.  There are probably some cultural problems behind this, but for me, one of the killers was that it requires a "media switch" for the document libraries.  I suppose it's useful to be able to have Office docs on the site, but the bulk of our content is just text, and there's really no way in STS to link between documents.  I could never get the "comment on this document" function to work either.  Also, I found the email change notifications pretty much useless, while the RSS feed that MoinMoin has is at least somewhat usable, though I'd prefer a full-content feed.

One thing that we did was restrict viewing to members of the development team only, so we could have open debates without outside interference, and then we had a public STS site for everyone else.  I think that ultimately, what we wanted was a way to conduct a discussion and communicate, plus a public site to serve as a repository for our formal output.  I think that's the crux: SP seems to be oriented towards a more formal site, but that tends to be a little stifling.

I just helped a team set up MoinMoin for a new project, and I'm very interested to see whether a wiki does better for us than STS.  I wish that I'd set up a wiki for my current project: I'm having to learn a whole new domain and the linkages between its areas, and I've frequently found myself taking notes in a meeting and thinking that a wiki would really pull my notes together well.  Also, MoinMoin has the significant advantage of being free, so it's a very low-cost way to experiment.  I can't imagine being able to do that with the costs associated with SharePoint. 

In response to the specific question: I haven't looked at the latest version, but I've talked to others who have, and they say that it's essentially the same as what SP was 2 years ago, just with a few more bells and whistles. I should add that what SP does is add some role-based security, very rudimentary in STS, more advanced in SP Portal Server (IIRC, it has roles like "Author", "Approver", etc.). Also, the document libraries have a mechanism for creating "views". For instance, we have a well-defined load process (because our data center management is outsourced), so we have a document with the load instructions for each load; in the view, you mark a document as "completed", and the default view shows only the documents that aren't. So I think that SharePoint is probably a good tool if you have a process or workflow that you want to support, but less good at open-ended communication.
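To make the "views" idea concrete: a view is basically a stored, named filter over a document library's metadata. This is just a minimal sketch of the concept in Python, nothing like SharePoint's actual API; the field names and the "pending loads" view are my own inventions for illustration.

```python
# A sketch of a document library "view": documents carry metadata,
# and a view is just a named, stored filter over that metadata.
# The field names here are illustrative, not SharePoint's schema.
from dataclasses import dataclass, field

@dataclass
class Document:
    name: str
    completed: bool = False          # the flag our load process flips
    metadata: dict = field(default_factory=dict)

class DocumentLibrary:
    def __init__(self):
        self.documents = []
        self.views = {}              # view name -> predicate function

    def add_view(self, name, predicate):
        self.views[name] = predicate

    def view(self, name):
        return [d for d in self.documents if self.views[name](d)]

# The default view shows only the loads that still need to be run.
library = DocumentLibrary()
library.add_view("pending loads", lambda d: not d.completed)

library.documents.append(Document("load instructions 2003-08-04"))
library.documents.append(Document("load instructions 2003-08-11"))
library.documents[0].completed = True    # the first load has been run

for doc in library.view("pending loads"):
    print(doc.name)                  # only the 2003-08-11 load shows up
```

The point is that the library is a single store and each view is just a different query over it, which is exactly the kind of process support a plain wiki page doesn't give you.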

12:31:43 PM  permalink 

Ted Leung points to the latest by Paul Graham.  I had some thoughts along these lines a while back, but dismissed them because vigilante justice seems simply wrong.  However, Paul does mention retrieving the URL embedded in a spammy email as a way to do additional verification, and in this context, I think it could be OK.  Still, I have to wonder about things like whether an email classification program should respect robots.txt.  Paul mentions the idea of forging the User-Agent in the retrieval request, which seems to me somewhat unfair.  I know, "unfair" in the context of a spammer, who could very well be a scam artist, is a weird idea, but I keep thinking that this sort of behavior is ultimately a bad idea.  If your email agent is banned by User-Agent, or gets a 404 or some other error indicating that the link's bogus, I'd say that's enough to throw the email into the spam bucket.

Another weakness in Paul's idea is that it might require a whitelist to keep from flooding innocent URLs; for example, the Yahoo! Mail or Hotmail auto-signatures, or any other URL that a spammer might throw into the text to throw a filter off the scent.  Such a whitelist would likely need to be centralized, but the beauty of Paul's original Bayesian filtering proposal was that it's decentralized: ultimately, filters are a personalized thing.

Still, Paul's newest essay is nothing if not thought-provoking, and as seems typical for his writing, some of the best thoughts are in the footnotes.  I especially like the offhand reference in the second footnote to writing an intelligent agent to "talk" to MLM and 419 scammers.  Paul didn't expand on it, but I believe the idea is to use up a spammer's resources talking to a computer instead of a human.  I've read accounts of people who've done this to 419 scammers, and I have a friend who's responded to several obvious scams for fun, stringing along the scammer for a while.  That could be a really interesting research project.  Maybe I should go back to grad school.
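To pin down what the "fair" version of the check would look like, here's a minimal sketch in modern Python: announce an honest agent name (the "spamcheck-bot" string is my invention), respect robots.txt, and treat any refusal or error as evidence against the message, rather than forging a browser's User-Agent to sneak past a ban.

```python
# A sketch of a "polite" URL check: fetch a link from a suspect
# message with an honest User-Agent, respect robots.txt, and treat
# refusals and errors as evidence of spam.
import urllib.error
import urllib.request
import urllib.robotparser
from urllib.parse import urlsplit, urlunsplit

USER_AGENT = "spamcheck-bot/0.1"    # honest, not a forged browser string

def looks_spammy(url):
    parts = urlsplit(url)
    robots_url = urlunsplit((parts.scheme, parts.netloc, "/robots.txt", "", ""))

    # If robots.txt bans our agent, stop there; as argued above, a ban
    # is itself reason enough to call the message spam.
    robots = urllib.robotparser.RobotFileParser(robots_url)
    try:
        robots.read()
    except OSError:                 # URLError subclasses OSError
        return True                 # unreachable host: call it bogus
    if not robots.can_fetch(USER_AGENT, url):
        return True

    request = urllib.request.Request(url, headers={"User-Agent": USER_AGENT})
    try:
        urllib.request.urlopen(request, timeout=10)
    except urllib.error.HTTPError:  # 404 or similar: the link is bogus
        return True
    except OSError:                 # connection refused, DNS failure, etc.
        return True
    return False                    # the page exists; retrieval succeeded
```

A real filter would presumably fold this signal into the Bayesian score rather than treating it as a verdict, and would rate-limit the requests so the check itself doesn't become the flood I worried about above.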

10:36:05 AM  permalink 

