The Global Cycling Network is arguably the most famous provider of cycling-related video content out there. What started five years ago as a concept of a startup communications agency turned into a rather large YouTube enterprise. For road cycling, there are now five presenters plus a Tech guy and the occasional cameo of the man behind the camera. GCN has launched sister channels for mountain-biking (and e-mountain-biking) and triathlon as well as its own spin-off, the GCN Tech channel.
It’s gotten a bit commercial along the way: GCN produces mostly sponsored content and has a very strong bias towards brands with large marketing budgets and high price markups. Well, that’s what you get if it’s the work of a communications agency. Still, their factory tours are fascinating content. Their training videos provide decent recommendations for beginners. (Some of the seven-tips-to-ride-a-bike videos I don’t watch anymore, as they’ve gotten repetitive or a bit narrow-minded.) They tell some interesting stories about the life of pro riders. Their weekly GCN show is an easy way to stay superficially informed. Overall, I’ve learned a lot from them, and I really like the content. One great piece, for instance, is this short documentary on the pressing problem of contaminated nutritional supplements.
But I do not like everything. And one thing in particular.
One element of the GCN show is a discussion of the latest scientific insights related to cycling. This could be about health benefits, financial benefits, time savings for commuters – really, anything. Even how cycling might impact your dating behavior or your coffee consumption. I really appreciate that they generally try to base their discussions on actual scientific output. It just bugs me how they do it. So my hope is that someone from GCN actually reads this and starts thinking. I focus on three issues.
1. A running gag can be fun, but it can also give all the wrong impressions
Whenever GCN bases a claim on published research, one of the presenters will introduce the study and name the journal in which it was published. Then, the other presenter will respond with a remark that this is one of his favorite journals ever, and that he’s a regular reader. I don’t know where it started, but it’s an obvious running gag. So be it. But unfortunately, some journals are bad journals, some journals are good journals, and most journals are something in between.
Here are two examples. The videos start at the right marker.
The first cites a study by Thomas Gaither and nine other colleagues in the Journal of Sexual Medicine. Dan Lloyd describes the journal as: “which is probably one of our favorite journals, probably #1; we do like reading it most days”.
This journal has an impact factor of roughly 3.1. The impact factor of a journal is a crude, imperfect, actually terrible, but better-than-nothing measure of a journal’s reputation. Basically, it measures how frequently articles from said journal are cited in other articles, i.e. how much it influences the scientific debate. Impact factors vary by field. Across fields, the average impact factor is ~1.1, and it’s about 1.4 in medicine. With an impact factor close to 3, the Journal of Sexual Medicine is a very good albeit not top-notch journal. On average, we might conclude, there’s a fair chance it’s reasonably impactful research. Which (we hope) says something about how good or bad the research is.
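For readers who wonder where such a number comes from: the commonly quoted two-year impact factor is just a ratio of citations to citable items. A quick sketch, using made-up illustrative numbers rather than the journal’s actual counts:

```latex
% Two-year impact factor of a journal for year y:
%   IF_y = (citations received in year y to items published in years y-1 and y-2)
%          / (citable items published in years y-1 and y-2)
% Hypothetical example: 100 citable items in the two prior years,
% collecting 310 citations this year:
\mathrm{IF}_{y} = \frac{310\ \text{citations}}{100\ \text{citable items}} = 3.1
```

So an impact factor of 3.1 means that, on average, a recent article from that journal gets cited about three times a year – which is why it is a (rough) proxy for influence, not for the quality of any single study.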
On to the second example. This time it’s a study by Niharika Duggal and four others that they published in Aging Cell. Aging Cell has an impact factor of ~6.7; that’s a lot. Emma Pooley names the journal, then Simon Richardson responds with a big smile. “Ah, Aging Cell. Definitely one of my favorites. Love that journal.” By the way, they then read out the title of the paper, but not a single author name. I’ll get back to that in a moment.
Si’s smile might simply confess how much he’s enjoying himself. To some viewers, though, it may read as irony, as if the study weren’t to be taken seriously. That would be the opposite effect of what GCN likely intends.
2. Authors, not journals, deserve the credit
Admittedly, this issue is not exclusive to GCN; it applies to most media outlets that communicate science. A typical introduction of a scientific publication might read: “A study in the journal X found” or “A study by researchers from the University of Y tested”. Very, very rarely will someone go to the lengths of naming the actual authors (who, due to international collaboration, may well be affiliated with a diverse set of universities). Take GCN again as an example: Emma Pooley reads out the full title of the article, which costs more than five seconds of precious air time, yet not a single author name. Apparently, names mean nothing.
You need to understand how the publication process works: big publishing companies run the journals. This means they provide the web space, they do the printing, and they distribute the press releases of those articles that they deem most interesting. They set the price and they determine the number of issues per annual volume. But they have virtually no influence on the content, other than that they can influence the editorial board. The editorial board consists of members of the scientific community. They appoint the reviewers of each article (again: peers, i.e. researchers who work on related topics) and they make the final decision on whether or not a study should be published. The actual hard work of doing the research… the journal has nothing to do with that.
Press mentions of names can make careers. The social media presence of academics grows increasingly relevant. To that end, some smart academics (Paul Krugman was one of the first) started writing excellent blogs to discuss their research in more depth. (Other academics, rather less relevantly, blog about gadgets instead…) In any case, we want to be found. Name recognition is a valuable currency in the fight for promotion and for resources for further research.
So please do the authors this favor. I know that author teams can be long lists of names. In my field, five is a lot, but in medicine, ten is not even all that much. I don’t urge GCN to name all of them. Speak of Niharika Duggal and her team. And then provide the full list of names in the video description. Which brings me to my third point.
3. Citations need references to be useful
When my students write their theses, a sentence like this frequently appears: “Previous researchers have shown that [etc.].” My feedback asks them to add a reference so that I can actually trace which previous research they are talking about. If they don’t provide that… well, anyone can claim that someone, somewhere, said something; why should anybody believe it?
Or think about it from the perspective of an interested reader: that study in the Journal of Sexual Medicine on the benefits of cycling for women’s sexual health sounds interesting, doesn’t it? GCN viewer Annemiek would like to read it, so she googles for it. Here’s the first page of search results.
Hit 1 is an article from 2012. Hit 2 is an article on the study that Tom Last and Dan Lloyd discuss in the GCN show; that article also provides a direct link to the study. Hit 3 actually addresses male sexual health. Hit 4 is way off topic. Now, I can tell you these things. But how would Annemiek know for certain?
For that reason, citations need accurate references. Arguably, that is more difficult in video content than in written pieces like this blog. But not that much more difficult. GCN already includes extensive descriptions with links to external sources, surveys, and promotional contests. To me, it’s just one easy step more to include a link to each scientific study they refer to. After all, if these are their favorite journals, I’d assume they’d want others to join the reader community.
Wrap: GCN is good, but GCN: get awesome!
And I think that sub-headline says it all. From a current good level of debating scientific insights (I like those segments in which they present more than one paper on a specific topic), they can easily improve to awesome. It’s only a matter of being serious where irony could be misunderstood, giving credit to the actual content creators, and being transparent about their sources.
Do that, and the hashtag #gcndoesscience surely would feel more appropriate.