It's all very well to bitch and moan about "tech industry controlling conference funding", but the reality is that conferences are expensive and the money needs to come from somewhere.
Speaking from experience: even a decent-sized "basic" conference (i.e. no fancy stage setups, etc.) will still cost you an arm and a leg.
You need to:
- Pay the venue for the space
- Pay the venue for the refreshments during breaks and meals (unless you are mean and don't feed people!)
- Pay for a basic AV setup
- Deal with registration
- Pay for hotel rooms and travel for your team
- Pay for various other things that soon add up (e.g. transport storage costs if you are shipping stuff there a few days before)
So you might say: well, how about going 100% digital? Well, trust me, the good platforms know they are good, and they charge accordingly.
Ok, you might say, "well, we'll charge registration fees". Sure you might, and sure, that might well cover 100% of your costs. But have you ever seen how registration goes for a conference? It takes time for the numbers to ramp up. In the meantime, you need money in the bank to pay for the stuff you need to pay for "now". And you need money in the bank as security for the contracts you'll be signing with the venues (if nobody turns up, or fewer people than expected, the venue will still want some money off you).
So, it's then a question of where the money comes from. And like it or not, corporate sponsorship is typically the easiest way. The corporate structure "understands" what a conference is, so you won't get bogged down in discussions. The corporate way is also the easiest way to get nice big chunks of cash instead of having to beg tens or hundreds of different people.
I knew some people who ran a popular regional tech conference. You’re exactly right that it’s incredibly expensive to run anything like this at a traditional venue.
As far as I could tell, the sponsors had zero additional asks. They paid money in exchange for the pre-determined advertising package to have their logo printed everywhere. At a certain level they got a booth where they could hand out swag and recruit. They didn’t direct the conference itself, though. Maybe it’s different at the mega-conferences, but locally there was no indication that the sponsors were trying to capture the conference.
The financial risk was relatively serious. At first they tried to keep ticket prices as low as possible, but they ended up losing money after things like refunds (more than expected) and surprise expenses (also more than expected).
Conferences are also a lightning rod for drama. No matter how much they stayed ahead of current trends, there was always someone trying to stir up conflict over something related to the conference or the speakers or the topics. If the instigators gained enough angry supporters, it resulted in a wave of refund requests that could ruin profitability.
Ultimately they ended the conference. I wouldn’t be surprised if they lost more money than they brought in over the years. The actual benefit for them was in their careers. Being the organizer of a medium-sized tech conference is an easy way to make your resume look very impressive. One of them landed a series of impressive jobs and then was quietly let go from most of them, because being a conference organizer doesn’t necessarily translate to being a good manager, but that’s a story for a different day. I’m sure he can still walk his resume into most companies and get it to the top of the pile based on his conference activities years ago.
> As far as I could tell, the sponsors had zero additional asks. They paid money in exchange for the pre-determined advertising package to have their logo printed everywhere. At a certain level they got a booth where they could hand out swag and recruit. They didn’t direct the conference itself, though. Maybe it’s different at the mega-conferences, but locally there was no indication that the sponsors were trying to capture the conference
And that’s exactly right. This is needlessly alarmist. A lot of accusations without any kind of macro analysis of what consequences this all has. It also doesn’t provide context of this versus government funding in academia, or offer reasonable alternatives. Nor does it recognize the growth these fields have had, and how the landscape has changed with scale.
Yep. I created a “hackathon” style event at a university for students pre-pandemic. The space was free and given by the university engineering college. For one day of 3 meals and t-shirts it was like $5,000. Partly because we had to go through the university catering service. And that was for about 60 students (we planned on I think around 90?). If we needed event space and the university charged us that would have been 3x at least. And this was run with nothing but volunteers. We had lots of corporate sponsors and honestly it was great. They mentored students, they set up career fair style booths and just had fun and shared good info about their companies. Without their partnership, frankly, a lot less would have been achieved. It was a win-win for everyone.
Oh god, University catering was the worst. They would happily charge fine dining prices for the most basic of meals, and would take forever to get back to you. I couldn't tell if we were getting the Fuck-Off rate or this was seriously what they charged everyone.
Yea… we got the discount rate and I appreciate the thoughts from the university but it was just way too expensive. Oh and they “couldn’t leave food out” so all of this coffee, breakfast bagels, fruit, etc. that we could have had available all day was gone after meal time because of some rule they had. I’m sure there were good intentions or there was a good reason but universities are far too inflexible when it comes to such things.
Furthermore, most of the people attending are being paid by their employers to do so. The number of tech-related events I would attend on my own dime and vacation could be counted on one hand with fingers to spare.
> Yes corporations understand conferences because there’s a benefit to being a sponsor/running one. One needn’t ignore conflicts of interests.
Well, there is the point that they will only sponsor relevant events; there's not much point sponsoring an event outside their industry or target market.
Also, in regulated markets such as healthcare and finance there may be regulatory restrictions on marketing to consumers. Hence it would be easier to sponsor an industry conference, because you would be marketing to industry professionals, and in that context regulatory restrictions on marketing are typically more relaxed because it is implied that the professionals should be wise enough to know when things are suitable.
As for the specific case of doctors. As has already been said, they have a strict obligation to their patients first, a fact that is no doubt hammered into them during their many years at medical school.
Sure. And doctors love to point this out. And yet time and again we've seen that recommendations are influenced by conferences and pharma reps visiting them. Sure, maybe some doctors remain immune, but a non-trivial portion of them treat it like the business transaction it is.
Doctors are expected to make prescriptions based solely on the interests of their patients, and it’s a big deal if they don’t. There’s not really an analogous concern in software. By all means let’s be aware of the incentives, but if Google wants to give a bunch of developers a good time as part of a strategy to get them using Kubernetes, I don’t see an issue with that.
Expectation doesn't always match reality, and there are plenty of studies showing that trade shows DO impact the recommendations doctors make to their patients. Doctors aren't a special class of human that has figured out how to avoid the biases that can be exploited in every other human.
I don’t really have any insights to provide on “big companies are using sponsorships to whitewash their impact on society” but something I’ve seen is presentations and research often done in the context of the company’s unique circumstances that are not necessarily broadly applicable. It’s only natural of course that a company will optimize their research into things that are relevant to them but when you’re listening to them you need to keep “what is the context that this was written in” in the back of your mind at all times.
For example, if you ever attend a C++ talk by Googlers, you’ll notice that they basically only talk about C++ as they use it, silently ignoring things they don’t care about. By virtue of google3 and their style guide, things like ABI compatibility are of very little consequence to them, and they can take away expensive-to-support APIs and present about how they “optimized” some part of the STL, or how support for exceptions is something they’ll look into later or (when they’re feeling uncharitable) bad actually™. Similarly most talks about Linux networking are driven by e.g. Facebook, who seem to slowly just be converging on running their entire stack using eBPF in the kernel. Apple presents their ML research about on-device learning and somewhere in the middle they’ll be like “oh also we have specialized silicon to do this efficiently otherwise it isn’t practical”. Microsoft will present virtualization research and you’ll find that their threat model is trying to prevent people from jailbreaking the Xbox.
I’m not trying to say this research is bad or not useful, but it’s important to put it in the context of where it’s coming from or who it’s being funded by, because the entire thing (from the premise, to the execution, all the way to the conclusion) is going to be dependent on the circumstances the research was done in, and often it’ll be presented as a general result when it really only makes sense in the context of that particular company’s needs. If you’re being cynical, it’s a way to appear open and exercise soft power through mindshare, but in most cases I think the alternative (no research) is probably worse, so I’m not generally concerned.
> For example, if you ever attend a C++ talk by Googlers ... present about how they “optimized” some part of the STL
I'm guessing you're thinking about things like Swiss Tables. But the situation isn't that Swiss Tables "optimize" the STL's containers; they're just the replacement you'd actually want. You can't "optimize" the STL containers, because they're defined in a way that's hostile to optimization.
Take std::unordered_set. Why is it so slow? Well, your standard library is obliged to make this work roughly the same way it would have been explained in an introductory CS Data Structures class in the 1980s. This is not necessary for the ordinary understanding or use of an "unordered set" (which is why Swiss Tables has one that's much better), but if you paid attention in that class you know there are buckets of keys with similar hash values, and that's what the STL is obliged to provide.
If you just want an "unordered set", you do not want std::unordered_set despite the name; you want the much better replacements from Swiss Tables or various other offerings. It's unfortunate that std::unordered_set is in the standard and those are not.
Of course the other reason std::unordered_set is so slow for you isn't solved by Swiss Tables, but the Googlers presenting Swiss Tables called it out more than once: your hash function is garbage. Even if you insist on using std::unordered_set because it was good enough thirty years ago or whatever, this part of the lesson is invaluable anyway.
When using data structures that are faster because of hashing, you defeat them by using poor-quality hashes. To a first approximation, if you aren't sure that you are using a good-quality hash, you probably aren't. In any optimisation quest, start by measuring, and in this case that means measuring: is your hash actually any good at... hashing?
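Here's a minimal sketch of that measurement, using nothing beyond the standard library (BadHash is a made-up stand-in for a garbage hash, not anything from the talks): count how lopsided the bucket distribution gets. A good hash spreads keys across buckets; a bad one piles them into a few chains and the table degenerates into linked-list scans.

```cpp
#include <algorithm>
#include <cstddef>
#include <cstdio>
#include <unordered_set>

// Hypothetical "garbage" hash: collapses every key into 16 values.
struct BadHash {
    std::size_t operator()(int v) const { return static_cast<std::size_t>(v) % 16; }
};

// Worst-case chain length: a rough proxy for lookup cost.
template <typename Set>
std::size_t worst_bucket(const Set& s) {
    std::size_t worst = 0;
    for (std::size_t b = 0; b < s.bucket_count(); ++b)
        worst = std::max(worst, s.bucket_size(b));
    return worst;
}

int main() {
    std::unordered_set<int> good;
    std::unordered_set<int, BadHash> bad;
    for (int i = 0; i < 100000; ++i) {
        good.insert(i);
        bad.insert(i);
    }
    // Expect a small number for the default hash, thousands for BadHash.
    std::printf("default hash, worst bucket: %zu\n", worst_bucket(good));
    std::printf("bad hash,     worst bucket: %zu\n", worst_bucket(bad));
}
```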
To further support your point, the engineers at Facebook who worked on the F14 hashmaps/hashsets saw the same issues, and created two implementations as a result:
“Folly has chosen to expose a fast C++ class without reference stability as well as a slower C++ class that allocates each entry in a separate node. The node-based version is not fully standard compliant, but it is drop-in compatible with the standard version in all the real code we’ve seen.”
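To make the trade-off in that quote concrete, here's a minimal sketch using only standard containers (folly's actual classes are F14FastSet and F14NodeSet) of the reference stability the node-based variant preserves. The standard guarantees that pointers into std::unordered_set survive rehashing, and some real code leans on that; a flat, open-addressing table moves elements when it grows, so it can't make the same promise.

```cpp
#include <cassert>
#include <unordered_set>

int main() {
    std::unordered_set<int> s;
    s.insert(42);
    const int* p = &*s.find(42);                   // pointer into the container

    for (int i = 0; i < 100000; ++i) s.insert(i);  // forces repeated rehashing

    assert(p == &*s.find(42));  // holds: node-based elements never move.
                                // A flat table (F14Fast, Swiss Tables)
                                // offers no such guarantee.
}
```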
This is basically my point, though. The things you’ve said (which are essentially what Google pushes) are correct. For their use case, they’ve found some good wins and there are lots of interesting things under the hood enabling this. That’s cool, but these improvements come at a cost: the ABI (and in some cases, the API) is different. For Google this is OK because they can just ask their clients to adapt. This might be fine for you as well. But it’s definitely not the case for everyone, and ignoring that (at best) or actively harping on the standard for making concessions for the “dumb” reason of stability (at worst) is not appropriate.
I think you're arguing against a straw man? Maybe you can give a concrete example of the behaviour you're concerned about.
Neither of the Google C++ Swiss Table presentations I've seen was about how std::unordered_set can somehow be replaced by this completely different thing. The existing container is just useless; that's sad, but there's nothing to be done about it because the ABI break cost is unacceptable.
Instead the talks were about why this thing (Swiss Tables) makes sense, how you can make use of it in your own software, and maybe about tricks you can use in your own data structures, or about how Hyrum's law interacts with this work.
Forget custom ML silicon: Apple also has custom Arm instructions that, as far as I'm aware, they still don't think we've earned the right to know about (officially).
Apple has an architectural license that allows them to build ARM-compatible processors with custom micro-architecture. I'm sure others like Google and Nvidia also have it.
> For example, if you ever attend a C++ talk by Googlers, you’ll notice that they basically only talk about C++ as they use it, silently ignoring things they don’t care about.
I sort of agree with what you just said. Perspective is very important, as solutions emerge from the problems in play for these companies. I will extend this a bit further: these solutions also increase our understanding and mental models for better & more secure products. What company X does is not limited only to X, but benefits others too.
As for corporate sway in research: it's my personal opinion (and limited only to me) that citizen awareness is generally high. HN and similar communities are quick to spot gaping holes or flaws, and alternatives are plenty as well. Fortunately, there is still a healthy ecosystem of indie developers who contribute to everything from the Linux kernel to iOS patches. As you mention, at present this does not seem a big concern, and the alternative scenario (no academia-industry symbiosis) could be worse.
Yeah, I definitely don't want to come across as asking companies to stop publishing their research. The information is always interesting to see, and sometimes even trying to suss out the bias can help you better understand the companies themselves. Picking the example of C++, you can tell that Microsoft cares about ABI stability because they ship an OS that exposes APIs to binaries (Apple cares as well, though they're far less vocal about C++; in the language they own, Swift, they've gone all the way to reifying an entire ABI-stable interface for generics). The problem is when e.g. Google presents something about not caring about ABI stability and, whether intentionally or not, recruits people to their cause, to the extent that I see Windows programmers who ship closed-source software clamoring for Microsoft to "stop preventing C++ from being more efficient and better" because they read a bunch of stuff about how std::string could get a better layout or something. This definitely isn't wrong, but the perspective is easily skewed by what your goals are, and it's easy to accidentally think Google's goals are the same as yours, because they certainly have no incentive to suggest otherwise.
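As a quick illustration of why that std::string layout is frozen (a minimal sketch, not anything from the talks): the object's size and small-string capacity are baked into every compiled binary that touches it, so "a better layout" is by definition an ABI break. The printed numbers vary by standard library; typically 32/15 on libstdc++ and 24/22 on libc++.

```cpp
#include <cstdio>
#include <string>

int main() {
    std::string s;  // empty: capacity() exposes the small-string buffer
    std::printf("sizeof(std::string) = %zu\n", sizeof(std::string));
    std::printf("empty s.capacity()  = %zu\n", s.capacity());
    // Any binary compiled against these values breaks if they change,
    // which is why the layout can't be "optimized" in place.
}
```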
The question of how much we actually avoid this is a complicated one to answer. I like to think that a lot of the obvious biases get caught, but I have also been around long enough to know that Hacker News is definitely not immune to this. My employer constantly falls into the trap of having a problem and then looking around to see how FAANG is solving it, then trying that solution largely uncritically, despite not quite being a FAANG. It's mildly amusing when you see a principal engineer with several times more experience (and compensation!) than you do get tripped up by it, but it only emphasizes that evaluating research with a critical eye is difficult and everyone struggles with it to some extent.
> when you see a principal engineer with several times more experience (and compensation!) than you do get tripped up by it.
Slightly off topic :) I feel that by the time people become Principals, they lose the laser-sharp focus because they are juggling too many things at the same time. Principals who work as ICs on a team, however, are much better, since they are hands-on with the current problems.
The basic premise that the author is expounding is "CS research will be guided by industry interests". This is good for the most part, save a few very remote situations. Conference funding in particular has a very tenuous relationship to what directions get a nod of approval in overall CS research.
CS (and EE) is a field which has seen great advances and adoption due to the tighter integration between industry and academia. The tech has advanced much due to the fact that researchers manage to work on hard, real problems. Many avenues of research emerge & advance from industry adoption - e.g. bandwidth compression, codecs, recommender systems, cryptocurrency, etc.
In my observation, whenever there has been a significant push by a single corporate entity, it has seen a palpable pushback. One example that comes to my mind is Amazon engineering vs. the Rust Foundation. Also, if any company tries to cut some major research's lifeline, there is always a competitor grabbing that opportunity to secure it for its own future advantage. There is no shortage of 500-pound gorillas when it comes to corporate sponsorship. Everyone wants a piece of the pie.
As mentioned, the only time this could suffer is when a company the size of Google or AWS establishes a complete monopoly on some line of research and ultimately sends it to an academic graveyard, much to everyone's horror. But to my limited knowledge, that kind of black swan event doesn't seem to have ever happened.
Corporate sponsorships aren't inherently evil. University researchers get a taste of real-world tech issues, and companies advance their tech through sponsorship. Conferences also become a good hiring venue for jobs and internships. It's a win-win from what I see. Conference sponsorship is in no way going to change the fate of ongoing research.
I don't know about CS conferences, but I've heard people complaining about free-software conferences with a lot of Big Tech sponsoring, where speakers were kindly requested by conference organizers not to talk too much about subjects "that may upset the sponsor" and things like that. If those kinds of forces are at play, then it is quite detrimental to receiving objective information.
I have attended a few PyCons, and fleetingly visited/watched several similar conferences. There seem to be enough instances where presenters have pressed companies like FAGMA to correct certain implementations in their product stack. Even C++ conferences have several times called out MS VC++ on its compiler peculiarities in the distant past. Whitehat security conferences are usually full blown on the offensive in showing how compromised some platforms could be. I am not discounting your concern, but unless evidence exists, this could be more hearsay or anecdote.
(I don't work for any of these companies and on several occasions have declined to interview. Full disclosure: no relationship.)
It's hard to provide evidence for this type of thing, but it's very much a real concern. Companies giving you a corporate sponsorship will definitely influence your decision making. Always. Choices are made, often literally behind the curtain, to appease them. Seeing a few public examples of pushback doesn't eliminate this fact.
(Full disclosure: I am an indie conference maker [0], so I'm biased against corporate sponsors.)
> Whitehat security conferences are usually full blown on the offensive in showing how compromised some platforms could be.
Until recently it was common for sponsor companies to threaten conferences to pull talks that demonstrated security flaws in their products. They’ve realized that the PR hit from this usually outweighs any “benefits” now but it used to be a whole thing.
I'm honestly not sure that a conference stage is typically the most appropriate place to go after companies for specific issues. I can think of one or two examples where (to me) it was called for. But I certainly wouldn't want it to be the norm.
ADDED: What I mean by this is that a conference filled with talks where people are badmouthing competitors or others sounds pretty unpleasant.
You are saying that the risks of industry interests have not come true. There has been much talk about companies trying to influence what products are being used in education, which, for universities, is closely related to research. And even if we assume this to be true, a risk that has not happened (much) is by no means guaranteed not to happen in the near future.
Also, the examples that you name, "bandwidth compression, codecs, recommender systems, cryptocurrency", are really quite sexy. They sound like things academics would research all on their own without corporate involvement. The thing is that basically these things can be researched if one just has a computer (or, in the case of bandwidth compression, a few computers); enormous amounts of money are not really necessary, so corporate involvement would not seem to be strictly necessary either. And then, when a corporation is involved, one runs the risk of the findings disappearing behind a copyright or patent wall. It is much better for all of us if they become available to everybody. That is, by the way, one of the reasons to have academia in the first place.
> There has been much talk about companies trying to influence what products are being used in education, which, for universities, is closely related to research.
I will speak for ML & Systems; I am not an expert on all domains. Most tools that researchers use in ML are open-sourced, e.g. PyTorch, TF, Keras, etc. Foundational papers like GFS, BigTable, Hadoop, etc. are publicly available now. There is a delay in how soon such work appears, but it isn't behind walls forever. Academia tends to choose open source over closed source; otherwise I'd have expected MATLAB to be more successful than Python there. It is not. In industry the practice may be more 50-50, or lean even further towards closed source.
> They sound like things academics would research all on their own without corporate involvement. The thing is that basically these things can be researched if one just has a computer.
How can you emulate operations at scale with just a computer, or a bunch of computers? That is where industrial efforts come into play. A lot of things work well for a dozen users, and then a completely different problem emerges when we talk about hundreds or thousands of users.
I am only alluding to the fact that CS, in contrast to many other disciplines, has a more symbiotic relationship with industry. Companies do have incentives to use copyright & trade secrets, but there is enough trickle-down effect to give academia room to pursue newer challenges. The cycle repeats over and over. Academia cannot replace industry, and vice versa. If there is any pressing problem with this symbiosis, it is labor attrition: more people are leaving academia for better pay. But that is not what the topic was about.
> "CS research will be guided by industry interests". This is good to most extent, save a few very remote situations.
You start out with a pretty strong statement there...
The point (for me) is not so much that a single company might try to push through some evil villainous plan. It's that all the companies that tend to sponsor such conferences (or more generally "guide" the research) have specific incentives.
Take as the most glaring example the way that machine learning and statistics have been developing over the last few years. The industry has an interest in collecting and knowing as much about their customers as possible. Most prominently, Facebook and Google are both pretty openly based on surveilling every detail of their users' (and everyone else's) lives.
ML research has been co-developing with this. The big money (grants, hardware support, PhD funding, conferences, ...) has been overwhelmingly in domains that directly benefit these players. A lot of "cutting edge" research at the moment is of little benefit to anyone who is not a surveillance capitalist megacorp, simply because of the compute & datasets needed to power these methods.
"Causality" has been a big topic over the last years. And yes, it will benefit a lot of things. But where does the actual research start? With the question "why did the user click that search page ad, and what ad should we show them next?"
Sure, there is a little research into privacy preserving ML, into "small data" ML, into federated learning (i.e. user-centric ML, not "distributed training" as in spreading computation over a big corp's cluster) and you can always argue "yeah but in a few years this will be commodity."
That sounds like trickle-down ML research to me. I'm not convinced. But you'd kinda have to make that case, because otherwise "this is good for the most part" doesn't seem so believable.
One big aspect of what industry-guided research has given people is all the burn-out, anxiety, sense of loss of agency, UI dark patterns, polarization, and dumbing down of the internet. Alongside some huge upsides, yes, but I wouldn't call these "a few very remote situations".
You have several fair points. The overall direction of course gets some incentives from industry. But there are government sponsorships & private fellowships too. ELLIS, DARPA, NSF, and NIH invest several billion dollars each year in R1, CAREER, MRI, and SURF programs, which take care of fledgling topics until they see more adoption. The Simons Foundation, for example, similarly hosts several hundred researchers to work on CS theory.
Also, Google and AWS in particular have put a lot of money into ML/RL-based solutions - on reducing electricity grid loads, AlphaFold protein & drug discovery, neuroscience, precision agriculture, personalized education & even interplanetary science/astronomy. You could argue these are glamorized CSR programs. But in net effect, they are advancing our understanding in several disciplines which do not directly feed their bottom lines.
(Full disclosure again: I am not affiliated with any FAGMA, nor have I benefitted from any of these grants.)
Seeing modern tech {e,in}volution, all driven by Big IT interests and almost zero by the interests of users and other not-so-big companies... I disagree.
Actual tech came from the big-lab era starting with Xerox PARC; since then no real evolution has been made, just improvements and new ways to make people bound by IT instead of empowered by it. IMVHO that means just one thing: private-company-driven IT evolution is harmful to society and must be rejected so thoroughly that no one in the future will even think for an instant of re-proposing it.
The solution IMO does not go through conferences so much as through universities, which must be publicly funded, and ONLY funded by the public: not to do research "for business" but "for society", not to form "workers of the future" but "citizens of the future". Doing so leaves both a healthy business world and a healthy society.
This argument suggests that by sponsoring conferences, companies can shape their content. That may be true for big commercial exhibitions, but I don’t see how it works for academic meetings. For those meetings, the financial backing is confirmed years in advance, while the actual program is set six months or so before the event. And at CS-style conferences, the content is determined by independent reviewers. So I can imagine that corporate sponsors might shape content by withdrawing support from conferences with low-quality papers, but otherwise they have essentially no role in what is presented, other than providing some high-profile keynote speakers who might increase conference visibility.
I came here to say the same thing. I've been on multiple program committees for multiple conferences and it would be laughable to claim sponsors have any influence over what gets published at a conference.
Sponsors have much more direct (and visible) influence over, you know, sponsorship of the research itself. But even there, corporate sponsorship is only one slice (and a relatively small one) of overall CS funding. There are plenty of government or independent sponsors who would be happy to fund research that is contrary to big corporate interests.
Yeah. It's truer of non-academic conferences which often even have sponsor slots. But even there, there's at least some effort not to have product pitches because if they tilt too far in that direction, people just won't attend. Conferences need to maintain some base level of quality/utility or they fade away.
> Industry is the main consumer of academic CS research, and 84% percent of CS professors receive at least some industry funding.
What's the percentage in other fields? I suspect chemical engineering is driven by the oil industry, pharmacy by drug companies. I imagine aerospace is heavily driven by very few companies.
Of course you could go get a liberal arts degree and be free from the tendrils of Big Dictionary but I think if we limit ourselves to STEM this is likely the standard. Is there an alternative? Or is my hypothesis wrong here?
> As governmental bodies rely on academics’ expert advice to shape policy regarding Artificial Intelligence, it is important that these academics not have conflicts of interests that may cloud or bias their judgement. Our work explores how Big Tech can actively distort the academic landscape to suit its needs. By comparing the well-studied actions of another industry (Big Tobacco) to the current actions of Big Tech we see similar strategies employed by both industries. These strategies enable either industry to sway and influence academic and public discourse. We examine the funding of academic research as a tool used by Big Tech to put forward a socially responsible public image, influence events hosted by and decisions made by funded universities, influence the research questions and plans of individual scientists, and discover receptive academics who can be leveraged. We demonstrate how Big Tech can affect academia from the institutional level down to individual researchers. Thus, we believe that it is vital, particularly for universities and other institutions of higher learning, to discuss the appropriateness and the tradeoffs of accepting funding from Big Tech, and what limitations or conditions should be put in place.
Is the capitalization “Big Tech” widely used or is it a shibboleth? I would guess that I can anticipate the tone of the paper based on that - I’m curious if that holds up.
(Of course, mentioning “Big Tobacco” does give its own hint.)
This article hangs a large portion of its motivation on the claim that "84% percent of CS professors receive at least some industry funding". To support this it cites a paper which, as far as I can tell, says no such thing. There are three problems with the claim, and I think they ruin the rest of the article.
First, the original paper only looked at UT, MIT, Stanford, and Berkeley. But a fair bit of industry funding is an exercise in prestige sharing: "I'm funding a professor at MIT". As a result, in my experience the top universities receive the lion's share of tech industry funding, and this very severely biases this claim.
Second, the paper counted faculty who had ever received funding over the course of their career, no matter how small. But the article doesn't say "received"; it says "receive". That is a huge difference. Looking back on my own career, I guess I'm on the list: once a VP of a local company chipped in maybe $15K to help my advisor fund me for my last PhD year, simply because he was excited by the research work; and I think once Google funded some undergraduate students of mine working on RoboCup. I realize some faculty are funded more by industry, but I think my situation is typical.
Third, because it is often meant for prestige rather than quid pro quo, and because Google and friends don't like paying overhead/indirect, the actual size of funding from industry tends to be very small and in the form of gifts. Not always, but usually. While Google might pay something like $30K to run a new program for Diversity in CS, DARPA is awarding a grant for $1.5 million to build a new multirobot architecture. The total industry funding of computer science, as a proportion of total funding, is probably somewhere south of 5%. I'd guess AI is about the same, but let's say 10% to be generous. See the following NSF graph. https://media.nature.com/lw800/magazine-assets/d41586-019-01...
So what are we left with? I don't think the article can claim that industry is of any significant consequence in CS academic funding. At most we could say that the industry may have funded faculty over the course of their careers, in some context, often minor and with a bias towards prestige universities. That seems to be a pretty weak hook to hang one's hat on.
The article seems like it's conflating philosophy/social science analysis of the impact of technology with academic computer science. Would papers on "the social justice impact of the atom bomb" be reasonable physics conference topics? I... really don't think so. That's social science. I'm not saying it shouldn't be talked about, but it seems pretty weird to complain that hard science conferences don't have a lot of social science topics, and then blame that on capitalism. Academic computer scientists or physicists aren't even likely the best people to be researching that! I don't think the physicists in the USSR were primary attendees of social science or philosophy conferences either.
Aside from that — and I suppose this part is capitalism's fault — where do they propose the money come from, if not the industry benefitting the most from the research? Big Tech has a lot of money, so conferences and research they sponsor gets a lot of funding. Banning tech companies, or restricting them, from funding research doesn't magically make other research get funded. If there was anywhere close to the ability to get funding from other sources, the article wouldn't exist. But that's where the money is now, and absent some societal upheaval and replacing capitalism with... ?... that's where you can get the money from.
Yes yes government research projects, but realistically those were all DARPA projects, and having the military be the primary funding source for research isn't exactly getting rid of conflicts of interest when it comes to determining social impact of said research.
Go to smaller independent conferences and make sure they're organised along ecological and ethically sound lines.
First, you'll have a better experience. Since the pandemic I haven't been on any academic jollies, but for years I've had a personal policy not to go to large conferences. They are too hectic. Everything is rushed and talks are squeezed. They value form over function and slick presentation uber alles. No time for slow conversation, precarious demos, mooching and mingling.
There are two kinds of conferences: the ones where people go to actually confer, and massive industry extravaganzas that are indistinguishable from trade shows. A friend who is a medical doctor once told me of an endocrine conference she attended, 5000 miles away in Africa, with 10,000 attendees, that ran for 12 days. She said it left her drained, bewildered and overwhelmed.
People don't attend conferences like that to learn anything; they go to be seen, as a footnote in the proceedings, so they can put it on their CV.
As a seasoned, senior academic secure in myself and my work, I don't need to chalk up creds like fresh post-docs, but my advice to anyone is: don't be cowed into thinking big impersonal conferences are "prestigious". Really, no one cares. Pick a small gathering, somewhere nice (a beach, forest or mountain venue), not too distant, where you can mingle and make acquaintances that will last for life.
Truth is, many of the most _influential_ "conferences" are really cliquey gatherings where you will meet the "right" people.
So get involved in the conference organisation. See who you can invite. That reduces costs, and it gives you some rights/leverage to say what you feel, and to understand the network in your field. If you object to Big Tech being sponsors, say so - conference organisers need _people_ as much as they need funding.
If you must attend a giant extravaganza, and you seriously, credibly object to funding by big tech, say so in your presentation. There is little or no comeback from politely dissing unethical big corps in your talk - in fact in some places it's de rigueur. Your disapproval ends up in the proceedings online, and that's not something company PR likes. Next year they won't offer sponsorship - mission accomplished.
As an academic you have a platform, and the right to speak your mind, so use it. Ethics is a very important part of research and you are not restricted in speaking about it. Best of all, start hosting your own small conferences. It's a great learning experience. Many great conferences are hosted on a shoestring.
It’s really shitty to use the acronym “FAGMA” instead of “FAANG”. It’s clear homophobia and sucks to read. (I assume it’s a joke because (1) “FAMGA” works just as well and has no slur and (2) a quick search for it yields very few results.)
Hopefully someone will take it down for this or other reasons.
I completely agree with the concern over CS research being too heavily influenced by industry. An area I feel is underresearched is social media moderation, especially in the more open systems like reddit where anyone can moderate.
There has been no research that I've found that looks at the impact of Reddit's style of comment moderation: by default, authors are not notified of removals and the content is presented to them as if it's still live [1]. You can try and see it yourself here [2].
Some people in-the-know may brush off this behavior, assuming it wards off bad actors. However, bad actors have access to the same tool. The question is, what is the result of that?
It would be amazing if those fields dove in, but I don't think that's happened yet. Such researchers would need to pair with a coder. FWIW, this is the result of a Google Scholar search for "reddit moderation":
> Such researchers would need to pair with a coder.
To set up the code for the experiment, maybe, but that could be done by even a second-year CS student. A CS researcher wouldn't be needed at all to do this research. It's strictly a social science question.
I completely agree. Nonetheless, the research I've seen up to now has come from computing labs, and I was speculating that access or familiarity with the data may contribute to that.
Just curious: what is the relationship between Reddit moderation and conference funding? I seem to have missed your point. Could you please elaborate further?
The article suggests conference funding influences what gets researched, which is the wider concern. The link is between conference funding and research. Reddit moderation is an example of something I feel is underresearched.
Thanks for your question, I edited my original comment to hopefully be more clear on this.
There are enough significant impact conferences like FOCS or the TCS+ symposium, which have no/minimal corporate funding. In the initial years, even ICLR didn't have corporate sponsorship. That actually does not stop people from probing pressing issues.
Reddit censorship may not be a great analogue for corporate funding influence; maybe it's even anecdotal. There are more academics on Twitter, and I haven't seen evidence of censorship. (I have active accounts on both, and the difference in engagement on Twitter is quite visible.)
> There are enough significant impact conferences like FOCS or the TCS+ symposium, which have no/minimal corporate funding. In the initial years, even ICLR didn't have corporate sponsorship. That actually does not stop people from probing pressing issues.
Sorry if I wrote this in the wrong place. IMO it is a pressing issue that has not been researched. The assumption among many is there is nothing bad happening.
> I haven't seen evidence of censorship.
The site in my profile can yield examples under "How do people react?"