Updated: 4/1/2005; 10:29:33 PM.
Berlind's Media Transparency Channel
If you're looking for my podcasts, please read "What to do if you're looking for my series of podcasts on IT Matters." Otherwise, read on.

This blog is now a part of my experiment in media transparency. The premise is that if the media can broadcast polished, edited content through one channel like ZDNet, then why can't it also broadcast a parallel channel that's full of the raw materials (thus, this "channel")? For a much more detailed explanation, be sure to check out the following:

In case you're interested, maintaining a simplistic transparency channel like this one has so far involved a significant amount of heavy lifting. The core technology may exist, but it's my opinion that a decent UI for publishing a transparency channel does not. So, one outgrowth of this experiment might be a complete specification for such a system -- something I call JOTS. Finally, as a student of media, convergence, and technology monoculture (three very interrelated issues, if you ask me), I'll be blogging any news that comes my way that I think is relevant to the media revolution that's upon us (the one that many media executives are in obvious denial about).
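To make the idea a little more concrete, here's a rough sketch (purely illustrative; this is not JOTS, which exists only as an idea at this point) of how a single entry in a transparency channel could be expressed with technology that already exists: a plain RSS 2.0 item whose enclosures point at the raw materials behind a polished story. Every URL, filename, and function name below is hypothetical.

    import xml.etree.ElementTree as ET

    def transparency_item(title, story_url, raw_materials):
        # Build an RSS <item> that links the polished story and attaches the
        # raw materials (interview audio, notes, source docs) as enclosures.
        item = ET.Element("item")
        ET.SubElement(item, "title").text = title
        ET.SubElement(item, "link").text = story_url  # the polished, edited piece
        for url, mime_type, size_in_bytes in raw_materials:
            ET.SubElement(item, "enclosure",
                          url=url, type=mime_type, length=str(size_in_bytes))
        return item

    # Hypothetical example: one story plus two pieces of raw material.
    item = transparency_item(
        "Windows vs. Linux security study",
        "http://example.com/stories/security-study",
        [("http://example.com/raw/interview.mp3", "audio/mpeg", 1048576),
         ("http://example.com/raw/reporters-notes.txt", "text/plain", 2048)])
    print(ET.tostring(item, encoding="unicode"))

The heavy lifting I mentioned isn't in producing XML like this (aggregators already understand it); it's in a publishing UI that would let a writer attach the raw materials to a story as easily as posting the story itself.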
        

Friday, April 01, 2005

It appears as though Microsoft and research outfit Security Innovations are learning the hard way how a lack of research transparency can damage the credibility of a research project as well as those involved. The study in question, performed by Security Innovations, concluded that "Microsoft Windows Server 2003 has Fewer Security Flaws than Multiple Configurations of a Comparable Linux Server." Controversy erupted over the report when it was discovered last week, well after the report's results were originally unveiled at February's RSA Security Conference, that the research was commissioned by Microsoft. That important detail was not disclosed when the results were originally reported to conference attendees, leading Counterpane Internet Security founder Bruce Schneier to tell the Seattle Post-Intelligencer, "It was evidence that Microsoft was doing better, and now the evidence is tainted....The results might be accurate, but now nobody's going to care, because all they'll see is a bias that was undisclosed."

Though the disclosure is made in the report itself, Security Innovations' summary page describing the research still makes no mention of who funded it (by the time you read this, that may have changed). It should, because plenty of IT buyers pay no attention to research that's commissioned by vendors, and they have a right to know, without having to dig into the report, whether or not it was vendor-funded. According to the Post-Intelligencer's report, the researchers behind the effort claimed full transparency, saying "Our own requirement for the methodology was that it had to be very open and transparent. We wanted to give people the recipe so they could go out and recalculate the numbers for themselves." The researchers also maintained that Microsoft was not allowed any editorial control over their methodology.

Unfortunately, that sort of transparency, sans the information regarding who funded the research, doesn't paint the complete picture of commissioned research, and here's why: what happens in the case of commissioned research that doesn't find in favor of the vendor that sponsored it? In my 15 years of journalism, I can't recall a single time I saw a commissioned study that concluded in favor of the sponsor's competition. Can you? (Please write to me about it or comment below if you can.) I've seen commissioned studies where the competition wins on a few points, but never overall. While I can't say definitively that such a study has never been made publicly available, I've heard plenty of stories about how such studies do exist. But since a vendor is paying for the research in the first place, it has every right to take the study home and stick it in a file cabinet where it will never see the light of day. Or the studies are used to help vendors figure out what they need to do to beat competitors (and are sometimes cited in vendor presentations as a way of making some forthcoming product upgrade sound credible and market-leading to the press).

Also, I have another nit about methodology transparency. Disclosing the methodology in such a way that allows others to reproduce the results is definitely better than not disclosing the methodology at all. But that misses the larger goal of transparency. Transparency is also about change and improvement. Had Security Innovations announced that it was performing the research study on behalf of Microsoft, published the proposed methodology, invited other security experts to suggest changes, and then incorporated those changes into the methodology (or explained why it didn't), that would have made the final results much more defensible (and trust me, if word got out that a research outfit was seeking feedback on a methodology for a Microsoft-funded comparison of Windows and Linux security, the researchers' suggestion box would have been overflowing with "help").

So, just making the methodology transparent, as Security Innovations did, isn't transparent enough, if you ask me. Real transparency involves not only disclosing the methodology and who funded the report (on the front page), but also the contract verbiage that describes what control the vendor may have over the methodology (reminder: in the above case, the researchers are claiming Microsoft had none) and what veto power, if any, the sponsor has over the publication of the research for public consumption.

Finally, the ultimate consumers of the research -- the people who make buying decisions based on it (whether they had direct access to the research or it trickled down to them in some other way) -- aren't the only ones who would benefit from full transparency. The studies and those involved (the researchers and vendors) benefit too. After all, when everything is above board and more defensible, it can strengthen the reputations of those involved. But the minute a lack of transparency becomes an issue around a particular piece of research, everyone goes down with the ship. That can't be good for business.

10:24:11 PM

Dan Gillmor picked up on an important report for the Carnegie Foundation entitled "Abandoning the News." For any mainstream media (MSM) outlet that's doing some soul searching (which they should be doing) and that's looking for survey data on the perception of their medium (newspaper, TV) versus the Internet, the report is backed by a revealing PowerPoint presentation that gets into the heads of 18- to 34-year-olds (obviously, a very important group).

Why the established media should care: In no small way, it articulates the challenges that the MSM will face as a result of democratized news provision.

Relevance to transparency: First and most important, trustworthiness is listed as one of the top three criteria in selecting information sources (it appears at the top of the list but it's not clear whether list order is an indicator of importance). Timeliness is listed second. While the Internet gets high marks for timeliness, it's perceived to be the second worst in terms of trustworthiness. So, given that transparency is about credibility and trustworthiness... well, now you understand the relevance.

9:34:38 AM

© Copyright 2005 David Berlind.
 