The Illustrated Guide to a PhD
matt.might.net
365 points by chii 19 hours ago
Original author of the guide here. Wonderful to see these little illustrations still making the rounds. I first published them in 2010!
To those in the comments who mentioned you are just starting your own PhD: Good luck to you! And, I hope you, like I once did, find a problem that you can fall in love with for a few years.
To those just finished: Congratulations! Don’t forget to keep pushing!
To those many years out: You have to keep pushing too, but there can be tremendous value in starting all over again by pushing in a different direction. You have no idea what you may find between the tips of two fields.
Any advice for PhD dropouts? I spent years and years pushing against that boundary in an obscure corner of my field and it never moved. What little funding I had dried up and I left grad school with a half-finished dissertation, no PhD, and a giant pile of broken dreams.
I'm sure over the years you've known students who have started a PhD and not finished. What (if anything) have you said to them? Do you feel their efforts had any value?
I'll give you advice. Success in pursuing a PhD isn’t just about the discipline or the degree—it’s about finding the right environment to support you. If earning your PhD is still a dream, focus on identifying a program that aligns with your needs and strengths. Look for a school with the right resources, a program that’s well-structured, and, most importantly, a supportive advisor who believes in your potential. Combined with your dedication and passion, these factors can make all the difference in achieving your goal. Don’t lose heart—sometimes, the right opportunity can change everything.
Disclaimer: I have no idea what I'm talking about. I've never participated in a graduate program.
I'm a PhD dropout myself. Serious question: what kind of advice are you looking for exactly? This is not intended as an insult, but it sounds like what you're looking for is not advice but rather consolation, which is natural and understandable given the circumstances.
Reading the original post led me to this article on your site: https://matt.might.net/articles/my-sons-killer/#full
This is just to say I found it incredibly compelling and moving; I hope mentioning it doesn't make you feel bad.
Nothing to feel bad about. Thank you for sharing that too.
My son’s life changed my own in profound ways, and even though he died four years ago, he is still changing my life in profound ways. I am always grateful for the reminder and to reconnect with the purpose that his life gave to mine.
That post also reminds me that while he was alive, I did the best I could for him under my abilities, and that’s all any parent can do in the end.
If you want to know more about his life, I wrote on it here: https://bertrand.might.net/
>>> but there can be tremendous value in starting all over again by pushing in a different direction.
This rings true for me at this time. I've been done about 10 years now; I never went into academia but went directly into industry. Things seem a bit stale, so maybe it's time to pick and research something new. I've been hesitating on the "going back to school" thing. But quantum does show promise, for curiosity and potential rather than immediate impact.
Matt, thanks for the encouraging words... enjoyed your compiler class and sad that you didn't end up on my PhD committee... done 3 years now but stuck lol.
Thank you, Matt! Loved your guide when I started my own PhD in ye olden days, and I've shown it to a lot of people since then.
Yes, I can attest that nowadays, in some fields, research has become a 'game', where:
- people torture data until it yields unreproducible results;
- people choose venues that maximise their chances of getting published (and pay for publication sometimes, I'm looking at you, APC);
- little concern given to excellence, rigour, and impact;
- the chase for a 'diploma' from a renowned institute without putting in the effort;
I could go on and on, but I'll stop now.
Perhaps something will change; I have been waiting for this to happen for some time now (10y and counting).
It's a bad system but that's what we have (at the moment).
I’ve seen this with a PhD student publishing several rapid fire papers in MDPI journals. They are repeating well understood physics work done 50 years ago using off the shelf commercial simulation software. They don’t cite any papers older than a decade and claim without irony that the work is “significant” while none of their papers are cited. They will go to events where no one is an expert in the field and win prizes for showing lots of pretty pictures but nothing that isn’t already well understood.
When I, an expert in the field, tell them they need to produce something novel at their research panels I’m told I’m wrong. When I list all the work they are ripping off I’m told it’s somehow different without explanation. When I question the obvious sloppiness in their work (the simulation data showing major artefacts) they blow up at me screaming and shouting.
I’ve never experienced arrogance like this before. It’s shocking. Their supervisors tell me that they are close to firing them but then also celebrate all the publications they are getting.
The mind boggles.
> When I, an expert in the field, tell them they need to produce something novel at their research panels I’m told I’m wrong. When I list all the work they are ripping off I’m told it’s somehow different without explanation. When I question the obvious sloppiness in their work (the simulation data showing major artefacts) they blow up at me screaming and shouting.
At the risk of relying totally on assumptions, that wouldn't be a surprising reaction for someone facing their first serious criticism after an entire life of probably being unconditionally lauded for their smarts (or the projection of it). When parents push children towards something relentlessly without providing any constructive feedback, on account of living their dreams through their children and/or the fear of discouraging the child, any criticism can feel like someone is trying to destroy your life goals.
> Their supervisors tell me that they are close to firing them but then also celebrate all the publications they are getting.
Probably trying to protect themselves from being in the crosshairs of one of many things that can blow your career apart.
This individual is pretty unique in this regard. I’ve never seen anything like it. Most students will acknowledge that I know the literature and will accept guidance. This person seems to think they know everything but their work is the equivalent of a tutorial case in the commercial software.
I've seen it often, as I have had to read people's theses when interviewing for job roles.
Some from reputable universities. I have no idea how they defended them.
I've experienced this in the corporate world too, when someone is seeking a promotion. Entitlement is becoming a bane
Ok, to be fair, the original is probably a badly scanned tech report from GE from the '70s with minimal implementation details. Anyone who has tried to implement an obscure physics paper from that era knows how tough it can be.
I think there is value in revisiting some of this work with our modern toolsets and publishing the code in some public repository.
But of course with a clear citation chain, and no pompous lies that a new discovery was made.
It’s a really basic engineering problem that has been studied extensively, and we teach it at the undergraduate level.
When I made the point that there is no scientific novelty here, they insisted that their PhD was a ‘generic’ one and that means they can continue to run basic simulations according to the recipe.
This isn't new, and academia has been rewarding behavior that wouldn't survive elsewhere for a long time.
Maybe it's time we unshackled ourselves from these 'prestigious institutions'?
Your anti-intellectual bias is showing. There are problems in all domains. I’ve seen plenty of arrogant fools in industry too.
They’re using an industry tool to do well trodden industry problems that were solved by academics decades ago.
I’m not tolerating his behaviour and I’ve made my views clear to my colleagues. He’s going to burn every bridge possible with this behaviour.
Criticism of university organizations is “anti-intellectual bias”?
Do you think criticism of religious organizations is “against god”?
I'm all in favor of intellectuals, it's academic bureaucracies I'm not fond of.
I think Socrates was a hoot, and he taught in a cave or something like that.
Priests teaching rural peasants to read in their monasteries, and collegial colleges for the public benefit are definitely meritorious.
But, I mean, there is enormous corruption going on.
How did Ren Youzai get into MIT? He was a body guard. Just because you've married into a billionaire's family MIT says "hey, send anyone you want in"?
And I'm sure MIT isn't alone in admitting mysteriously average students who not only get in but graduate when linked to massively rich and powerful families. A recent US president comes to mind. Is that anti-intellectual?
> I think Socrates was a hoot, and he taught in a cave or something like that.
I'm not sure whether you're joking or serious, but in any case, Socrates didn't teach in a cave, and you're probably referring to Plato's allegory of the cave.
The interlocutors and followers of Socrates were mostly the wealthy elite of Athens.
@lapcat yeah, he argued in the markets or wherever.
I think I was mixing him up with Aristotle, e.g. https://www.ancient-origins.net/history-famous-people/caves-...
who had some cave school or something.
but there was some jokingness, yah. But I'm not anti-intellectual, which wasn't a joke.
I wasn't making any point about his students or wealth. Education then, as now, is the plaything of the wealthy and wealthy nations.
You’re arguing about highly specific cases while the vast majority of institutions get on with the job of educating large numbers of students and doing what research they can.
The highly specific cases are glaring examples that the unbiased meritocracy they pretend to be is, possibly, not so.
And the "large numbers of students" covers up the possible cronyism and/or corruption of the institution.
I provide an example of a totally unqualified individual being allowed into a prestigious institution solely on the basis of the family he married into. Your response is that they mostly do a good job for most people?
I've suggested that the research they do is not obviously beneficial to anyone except perhaps the person doing the research, possibly simply to advance their own careers (in or out of academia). Others have suggested the same.
You haven't disagreed.
Ugh, you’re a tiresome culture war poster. I’ve no interest in you or your hobby horse.
> - people choose venues that maximise their chances of getting published (and pay for publication sometimes, I'm looking at you, APC);
>
> - little concern given to excellence, rigour, and impact;
It's because the KPIs of assessment are built like this. Goodhart's law. I know lots of good researchers who get frustrated with the system and end up giving up and falling back to those 2 points. If within a uni 2 research groups are putting out research at different rates and different quality, the one with higher quality, lower frequency, and higher standards and ambition gets heavily penalized. Seen it in action.
Yep. I also know researchers who refuse to play this game, but their careers plateaued and they have to work with little funding.
Yes, in the STEM fields, for published papers, it's easy just to count them; much harder to read them; to evaluate them, some people just count awards, etc. So, the hard work that makes the good material in the paper may never be noticed.
There is, "You get what you pay for." So, want papers, you will get papers, and you can count them. It goes, did Haydn write 101 symphonies or 1 symphony 101 times?
Early on, had a good career going in computing but where occasionally some math made a lot of difference. So, to help that career got a Ph.D. in applied math. Never had any intention of being a professor but for a while did to try to help my seriously, fatally, stressed out wife (Valedictorian, PBK, Woodrow Wilson, NSF, Summa Cum Laude, Ph.D.) -- took a professor job near her family home and farm.
In my little Ph.D department, saw the Chair and four professors get fired and one more leave, fired or not. The career I had before grad school was a lot better than the one those professors had.
Had to conclude that, tenure or not, being a professor is, on average, a poor way to even reasonable financial security. Generally there is low pay, e.g., too little to buy a house, keep cars running, support a wife and family. There's a LOT of dirty politics, infighting, higher-ups who don't want you to be successful.
Bluntly, a research university takes in money that a lot of people care about and puts out papers that only a few people care about: Net, there is no very strong reason to pay professors enough for even reasonable financial security. Key sources of the money are short term grants from the usual suspects, NSF, DOE, DARPA, NIH ("too many for them all to be turned off at once" -- JB Conant?), but that is essentially just contract work and not steady employment, with little promise that when a Professor's baby is ready for college there will be money enough for them to go. It's a house built on paper that can be blown away by any thunder storm.
Now, for a career, e.g., financial security, to leave something for the kids in the family tree, regard business, e.g., now involving the Internet, as the best approach, and there regard computing and math as important tools but only tools. Research? Did some, and it is a key to the business. Academic research? Did some, published, on my own dime, still waiting for the checks.
History, how'd we get here? Used to be that some guy built a valuable business and had several sons. One of the sons inherited the business, and the rest went to the military as an officer, academics as a professor, or to politics. Then WWII showed that the STEM fields can be crucial for national security, and some related funding started, e.g., summer math programs for selected high school students, research grants.
"If you are so smart, why aren't you rich?"
Ah, "The business of America is business."
Chiming in with my own experience. I was with a new PI that was a charlatan; I am grateful for the experience because I now know how to recognize these sorts of people and avoid them.
"There is no journal of negative results." he would say at our weekly meetings. In order to secure his future, he set ablaze the dreams of 5 PhDs in my lab (all of which took their masters and went into industry; One developed severe OCD). Data was massaged, lies were told to his bosses.
Guess what? He's still a professor there, his lab still publishes dubious, unreproducible research. No recourse was to be had at the university (all of the PhDs went to the head of the department and were told to f*ck off).
Academia is in a death spiral at many schools, and I worry that it will be up to industry to carry the torch of research in the future.
Any evidence that this approach works? Are people who do this able to move from the PhD to a solid position afterward that they could not have had without the PhD?
> Perhaps something will change; I have been waiting for this to happen for some time now (10y and counting).
What will change is that PhD will become an inherited title. If your parents were/are PhDs, you will ceremoniously be granted the title when coming of age. That title can then be rented out to people or organizations (such as companies) who are required to have a PhD by government regulation for the activity that they are in. You can of course also mortgage this title to a bank or other company that will take care of the process.
Yes it's horrifying indeed. It wouldn't be a stretch to say that there are many that fall into this rat race and treat it as another type of MSc credential.
I'm starting a PhD — essentially from tomorrow. It's a shame to see so much discouragement here, but at this point I'm no longer surprised. I also don't care because if left to my own devices I would do research anyway.
In the kindest possible way: screw all of you!
> if left to my own devices I would do research anyway.
Then you're going to have a great time during your PhD, good luck and have fun!
> screw all of you!
"Disregard!" https://stepsandleaps.wordpress.com/2017/10/17/feynmans-brea...
Most of these people have never been in a PhD program, so take their words with a grain of salt.
If you have a good advisor, you're passionate about your project, and you have some good funding, you'll have a wonderful time exploring interesting ideas and becoming a competent researcher. Good luck!
All that’s true, and that is completely different than what’s advertised in this article.
I did my PhD (mapping and studying the physics of caves in the ice on an antarctic volcano) purely for fun, and it was awesome.
I hope you have a similarly rewarding experience. You will encounter unfair systems and unscrupulous people, and it will be frustrating. The data will be confusing as hell. My only advice is stay true to yourself. Maybe look into some of the new trends that could fix academia -- pre-registration, open access with public comment periods, reproducible code, etc. For inspiration, I cheer for crusaders like Data Colada who are trying to save the academic system.
That is the way. Not long ago, most of Science was done by rich people as a hobby. We need to get back to that system.
Please consider that not everyone has the luxury of having (had) a PhD advisor who really cares. There's a wide spectrum, ranging from micromanagers, to people you see once during your PhD, to advisors that are genuinely great (intellectually and as a person) and caring.
I wish you the best of luck for your PhD, a caring and supportive advisor, and great results!
I don't think there's any reason to be discouraged. There's a lot of bias against PhDs for various reasons (good and bad).
I have a PhD, got an academic position and then worked in various companies (startups, big tech company). These paths aren't exclusive.
I'm glad I did the PhD.
- it gave me time to work on a variety of interesting topics. In my company, I always feel rushed and don't have time to learn as much as I'd like to.
- I had more than one career. Working only in industry after graduation would have been pretty sad I think. Not that it's bad but it's great to see something different
- I developed some skills (for instance, talking in front of an audience, writing scientific papers), got to meet a lot of interesting people, and worked in different countries.
I also learned that research wasn't for me but it was worth doing the PhD anyway. If I had to do it again, I would pick my topic more carefully, and go straight to industry rather than pursuing an academic position (which I actually didn't like). Also money wise, even though I'm not materialistic, the pay was too low. Certainly enough to live, but not enough to secure my future and retirement.
I also started my PhD last week and honestly from my talks with the people there thus far I'm much more optimistic than the general HN view of PhDs. You still have to be realistic however. Best of luck!
Great! The view of almost anything hard is gloomy online, probably because the conversation is dominated by those who either wish they'd tried and now have a chip on their shoulder or who made the wrong choice for them and are self-therapising by writing about it. Those who thrived in a PhD programme likely don't have a reason to bang on about their experience in quite so many words.
> You still have to be realistic
I'm expecting it to be very challenging. But that's the point — isn't it?
Good luck to you too.
Disregard them. A lot of people fixate on the 1 in a billion celebrity exceptions like Musk, Thiel, Gates, Dyson, et al and go “look look you don’t need a PhD!”
Yes, a highly motivated college dropout with a computer, a strong financial safety net, and the right social connections can be in the right place at the right time to seize big opportunities. Most people are not in that position. Many high-impact technologies need more than what just a computer can do.
The main thing is to be self aware enough to know the path you’re on, what paths are available to you, and how to make the most of the connections and resources you have available to you. The second you start to get pigeonholed, wrap things up and move on.
> The second you start to get pigeonholed...
That seems like good advice.
Yes! Be very aware of your time and opportunity costs. It can be an amazing journey, struggles and all, but make sure to not get stuck long-hauling on something you’re not passionate about.
This is the best possible answer to Internet-based negativity! I wish you luck in your PhD :)
"PhD Envy" is a real part of office politics outside of academia. Remember that the naysayers are just jockeying for their own status. On the one hand, you can ignore it. On the other, learning to manage it is a good starting point to navigating the social and political side of any career.
Also, this is HN, which revolves around an occupation -- computer programming -- that is unique in terms of having high demand while remaining flexible about how and where people learn their skills. Not all fields are that way.
I got a PhD in physics, in 1993, and have worked in industry since then. There are a couple of "negatives" that I still think are worth pointing out:
1. PhD programs have very high attrition, and you bear most of the risk on your own shoulders. It's worth going in with eyes open, and knowing the risks. Getting out with your PhD may require some compromises along the way. I won't necessarily call them ethical compromises, but perhaps compromises to the (typically) idealistic views that many students start out with.
2. The little nub of specialized knowledge shown in TFA is your research, not your brain. You can do specialized research without becoming a specialized person if you want. This is a personal choice (academic freedom and all that). My dad, who also had a PhD and a good industry career, always told me to avoid hyper-specialization.
Don't forget to learn how to code, just in case. ;-)
> PhD programs have very high attrition, and you bear most of the risk on your own shoulders.
How many people do you know who “failed to meet the standard”? Zero. If you do the time and work for your professors you will get the reward. There is no risk.
> PhD Envy" is a real part of office politics
The most vocal critics are not bachelor degree holders, but those who did it and had a bad experience.
>>> If you do the time and work for your professors you will get the reward. There is no risk.
1. Your experiment fails to produce a result after a few years of effort (my project, we don't know to this day what went wrong, and I was lucky to find a new project).
2. Loss of funding or institutional support. (A large program at my state's university pulled its support for a process that required regulatory approval, and an entire group of faculty and students all had to leave.)
3. Your advisor quits, changes jobs, gets fired, goes to prison, dies. (Many cases).
4. Your advisor holds your thesis hostage until you publish a certain number of articles (a friend of mine, she sued and won).
5. Mental health issues (high incidence of clinical depression).
6. Personal animosity between members of your committee (another friend).
How these risks instantiate themselves is that you have to start from scratch, often with a completely new research project, and finding one isn't guaranteed by your department. You are almost completely at the mercy of one person -- your advisor. There is virtually no oversight.
I agree, those are all real, especially the advisor and committee.
Most of these are factors in any employment, and I would argue things like chance of losing funding at your job is worse than academic funding threats.
I wouldn't say it's about "failing to meet the standard". Sure, there is no exit exam you can fail, but there are still people dropping out of a PhD.
It could be because you realize you don't really like research, which involves reading and writing a lot of papers and going to conferences, not just tinkering. It could be because you had the wrong professor, who failed to lead you and left you by yourself. It could be because you gave up at a low point, which most PhD students go through. It could be because after 4 or 5 years your professor keeps saying "you're not ready yet" (I've seen that in humanities).
So it's not really a problem of "not being good enough", but it definitely happens.
Definitely true. Attrition is real, but "risk" is probably the wrong term. It's one of the lowest risk options available for young people.
I've worked with PhDs all my life in various industries and it has always been a fun and enjoyable partnership. Have fun.
@xanderlewis I don't think anyone's comments are meant for you specifically, nor to discourage you or anyone particularly.
If anything, take it all with a "grain of salt" and reflect on whether or not anyone you meet might resemble these comments. Hopefully, not your future self.
Good luck in your career.
Care to share any details? What country are you studying in, and what's the subject area?
UK; more specifically Scotland. And mathematics; more specifically (algebraic) topology and (differential) geometry.
From what I've read, graduate study on your side of the pond is a lot better than in the US. I really can't say why. The people I know who got their PhDs in Scotland were, for one thing, really sharp. That helps. For another, it seems there's more of an expectation of a difficult but manageable workload and risk level. Maybe more focus on research and less on politics. Of course, a better safety net, meaning less pressure to drop out if something happens like your spouse or child needs medical care.
Every country is different.
The nice thing about mathematics is that there probably won't be any failed or non-reproducible experiments in the lab. That doesn't mean that a math PhD is going to be easy, but you should be aware that a lot of people will have a different idea of what you are doing if you don't tell them that your PhD is in math.
Best of luck for your PhD! You might want to check out this ted talk: https://www.ted.com/talks/uri_alon_why_science_demands_a_lea...
Thanks! I hope so. Experiments (to the extent that there are experiments in mathematics, which arguably there very much are) often fail, but once they succeed they're usually fairly bulletproof, and reproducibility is barely even a concept.
If time and money weren't concerns, I would love to do mathematics research!
I hope you achieve good things, and have fun while at it!
Wouldn't we all? I'm very grateful to live in a world (and time) where such opportunities exist.
There is literally an ocean of difference between mathematics and ML (which seems to be what a lot of comments are talking about).
Yes. Which is why I'm trying to push back a bit and say 'hang on... none of this is intrinsic to the PhD system'. Of course it's true that some PhDs — hell, some disciplines — are built to an embarrassing degree on BS and academic schmoozery, but there's no need to tar everyone with the same brush. It seems as though some commenters have difficulty conceiving of intellectual pursuits that don't involve 'data' and 'graphs'.
I'm only very junior, though, so I don't have total confidence that I'm right. But I'm pretty certain I am.
I hope you have a decent source of secondary income or your family is reasonably well off.
A math PhD might take 6-7 years to complete and I hope that, at the end of it all, you won’t have to come to London to look for C++ or OCaml jobs at hedge funds or banks.
I'm in the UK. It takes nominally three years here; usually three and a half. I also have full funding.
...this is the discouraging negativity I'm talking about. I do, respectfully, wonder what your agenda is.
Because I have worked with many math PhDs who lost their youth to something they could not make a living on (research positions at universities are few and the competition is intense) and ended up writing C++ implementations of derivatives pricing models for a (comfortable) living.
I am not trying to discourage you, just offering a different perspective.
Well, luckily I'm not doing this in the hope of increasing my earning potential. It's an entirely separate pursuit. I have no doubt that what you're saying is true, but I don't think I'm bothered by it since it's not my goal.
Good luck with your PhD. Stay focused and enjoy this time, you will have a freedom to explore and study that’s almost unparalleled in post-grad life!
A PhD now is an expensive way to signal that you can persevere longer than the average human.
Unfortunately, that is not always a positive as many real life situations require you to make decisions under extreme paucity of information and reverse or change course at short notice. For such professions and roles it is a liability.
I can see why you might think that; in some cases I'd agree. But there are parts of science — indeed of human knowledge in general — that are very difficult to break into if you don't have the opportunity a PhD affords. These disciplines also require perseverance longer than the average human. Without this system, we're not going to make fundamental progress.
I'm pretty sure that without the research done by people with PhDs and people who don't give up at the first hurdle, we wouldn't be able to be sitting at our keyboards now having this conversation. Of course, it's not for everyone. Maybe it's not for most. But I don't think you should write all of it off as 'signalling'. Some research simply cannot be done without several years of focus, outside of industry or 'the real world'.
I just left academia (after one three-year postdoc). Good riddance. For myself at least. I do think one can thrive in it, if you are a good salesman, don’t mind sucking up to those with money with lies and exaggerations, and don’t mind isolation
What kind of area were you in?
It's hard to believe this 'sucking up to those with money' thing applies everywhere, though it's easier to imagine it applies in certain domains.
I did a PhD and about a decade in postdoc/early-career researcher posts before moving into tech. That was in Computational Mathematics, it was clear that the most successful people in the field were the ones who had found areas where they could publish "Technique X applied to field Y" type papers, so for each new X they could get 10 publications (by way of 10 different PhD students). These people generally could steer the core funding in the discipline their way.
Everyone else basically had to reformulate their research to pretend it was applicable to the government's funding subject du jour. This led to some quite large stretches in definition to achieve "<Main area of research>, and some applications in <funding stream>". This very much felt like sucking up to money.
I got out of academia in the end because it felt like the more senior I got, the more time I spent applying for funding and managing the spending, and the less time I spent doing research/development. (Also, given I was in a UK public sector institute, the pay was shit due to 40 years of below-inflation pay rises crippling the institution.)
Yeah, that’s the thing. Not only do you have to beg for money, but you actually make fake papers to do so. My supervisor taught me this early on: every single tiny discovery or synthesis can be made into a paper even if it really doesn’t warrant it.
I left because the only path forward here in Germany is to become a professor, aka a life full of admin and sales
Why would a field be immune to political patterns found in every organization?
'Immune' might be too strong — we are all humans after all. But it's certainly plausible that the magnitude of the effect varies between disciplines.
As the commenter above observes, physics is (supposed to be) falsifiable, so it should be clear when you have a result and when you don't. In some of the more 'woolly' disciplines, this is not the case. You can write BS, and as long as you're able to argue sufficiently eloquently that your particular strain of BS is valid, you win — in some cases, you needn't even supply data or perform experiments. It is in those fields that I assume the forces of politics/fashion/social pressure are strongest.
Quantum physics
Could you be more specific about the pressure you felt to 'suck up' and 'lie'? I read this kind of thing often, but it's usually left quite vague. What exactly are people lying about?
Physics (since it's supposed to be rigorous) seems like a less likely area than some to be driven by politics and trends, but I suppose I can imagine that competing research programmes and ideas benefit from a certain amount of marketing and smooth-talking of people with funding rather than relying purely on empirical evidence for their claims.
Physics absolutely is dominated by politics and trends. I was constantly expected to exaggerate the impacts of my work and its possible applications, on both science in general and wider technology. For example, we used to always say that it had some impact on quantum computing even if it was a total lie, because that makes it way easier to get funding.
Physics may be in some sense more falsifiable, but it is absolutely subject to politics and social norms, both in how it lies about itself for money, and literally in which theories are chosen (since we can rarely empirically distinguish between them)
Right. That's a shame, but it seems like that's just life. The idea that it helps to constantly emphasise (or, as you say, exaggerate) the importance of your work if you want to have a career is certainly not restricted to academia. It seems to apply to most industry engineering jobs too — from what I've heard. I guess in that context funding isn't the issue, but being promoted (and, conversely, not being sacked) certainly is.
Well, in most engineering there is a market, and going out into the market and selling is not the job of the engineers.
Academia is rare in having the engineer also be the salesman.
My wife is the lead mechanical engineer at a small company and she definitely doesn’t have to go around convincing customers they need her products
It’s a nice idea that you’re going to help the boundary of human knowledge expand but I don’t think infinite progress is the right model.
All the evidence shows that fields are completely ignorant of each other and reinvent the basic solutions. This coincides with the theory that cohorts of experts develop expertise which is not transferrable.
Watch as ML rediscovers harmonic analysis while awarding plenty of PhDs to those involved.
Rediscovery is a great thing. You bring new meaning and context. It’s just not “expanding circle of knowledge”.
More likely, you will dig further down the track of the fads your advisor is into. The trend will be forgotten in a few decades, with a small chance of unforeseen utility later. And its contribution will be to your personal life.
The model proposed is also lacking in ambition because historically PhDs were significant.
I'm considering getting a master's or PhD (in PL) under a professor I work with now for my undergrad thesis. It has been my observation that the standard path of getting a standard corporate job tends to nullify all impact you could have (with a few rare exceptions). And after that I could get a job, become a professor, turn my research into a startup, etc. The pros are:
1) I know my professor and he's a solid guy
2) Pays decently well, money isn't too much of a concern
3) I get paid to do research, and the university provides generous grants if the research is turned into a startup
Cons
1) Hear a lot of bad things about the academic rat race, pressure to publish even at the master's/PhD level
2) I could probably hack out some paper into journals but whether I could have any real impact "on demand" (versus say spontaneously coming up with something) is a big question mark, especially within the deadlines given in the program
Any thoughts on this? Especially heuristics, methods or ways to increase impact?
Understand that a PhD is an apprenticeship to become a researcher. You are not expected to do career defining work as a PhD student, and indeed that is unlikely.
Your relationship with your advisor is very important. It seems like you already have that sorted out.
Most successful PhDs (in CS) involve tackling a relatively small and easy project, usually suggested by your advisor, early on, and then expanding and iterating on this. Once you make some progress on a topic you'll easily find more directions to take it.
Working with other people is one of the easiest ways to increase your productivity. All the great groups I saw had a lot of collaboration. Don't fall into the "lone scholar locked in the library" stereotype.
Avoid bad people. Avoid getting stuck in your own head. Realize a PhD is a project like many others. It doesn't define you. You start it, you work consistently on it, you finish it.
Doing a research Masters is usually a waste of time. Doing a taught Masters is a lot of fun, but something quite different to a PhD.
Thanks for the reply!
>A PhD is an apprenticeship to become a researcher
That's a good way to look at it. I suspect one of the biggest possible benefits of a PhD is that you're put in an environment structured to pressure you to develop something new, which is the opposite of most other human work.
>Start a relatively small and easy project and collaborate
Sound advice, it's the general approach I've taken for my undergrad thesis.
>A research masters is a waste of time, a coursework masters isn't
Really? It looks the opposite to me. A research master's lets you collaborate with different people and work on new things. A coursework master's is taking advanced classes.
At least in the UK, a PhD takes one year more than an MRes and it lets you become a university lecturer. It also should be funded, whilst you might have to pay your own way for an MRes. Hence I don't see the point of doing an MRes when you can stay an extra year and have more opportunities afterwards. An MRes is usually a consolation prize for people who drop out of their PhD, IME.
In my side of the world I think it's more similar to the USA, where a master's is two years and a PhD is four. And it's fifty-fifty whether a PhD student has a master's or comes straight from undergrad. I'm leaning towards a master's because I don't particularly care about the prestige and I don't want to overcommit. I don't think the title is important to the impact I have.
As a professor many years after the PhD, my advice is to do the PhD only if you are genuinely excited / cannot stop yourself from doing research. Only then will it outweigh the negatives: the difficulty of getting jobs, the somewhat low pay, etc. At least from my point of view, I always tried to work on what was interesting to me and what I was good at (or what was interesting to learn) vs optimising for what is more high-profile/sexy. I don't think it is universal advice, but at least I always enjoyed what I did.
I can only second this after having advised a few students from bachelor to PhD level. The ones who do well are (usually) the ones who are genuinely excited. Not only about the thing they're doing, but in general. It really helps getting over the lows.
Furthermore, do not underestimate the importance of sheer luck. Exaggerating a bit, deep learning was just another subfield of ML until GPU-powered DL really took off and made the researchers behind the most fundamental ideas superstars. This is not a given, and it might take years or decades until it's really clear whether you're making an impact or not.
I wish you the best of luck, InkCanon, and stay excited!
What do you mean by cannot stop doing research? I certainly haven't discovered anything new, but I love learning new things, reading about ideas, coming up with them.
I meant that you tend to spend free time on it, as opposed to treating it like a 9-to-5 job. And again, it is important that if you do that, it is because you just want to see what comes out of your experiment, learn a new thing, etc., rather than because you have to publish or are forced by your advisor.
You'll do great. This will eventually turn into new discoveries if you keep at it.
Why not just study a bunch of different things to Master's level then? Learning something genuinely new seems like it has a much lower return to effort.
Good question. IMO
1) There's a kind of "hard" learning, where you're learning in a fixed, structured way from a textbook.
2) There's a kind of "soft" learning which is transmission of knowledge, which happens a lot more face to face when you're working together.
3) Then there's a kind of research learning, where you're doing something new, usually with collaborators.
The second and third are really best done in certain environments like research or good companies
>getting a standard corporate job tends to nullify all impact you could have
It's very strange to me that you think other people would pay you millions or tens of millions of dollars over an average 30- or 40-year career, without you generating at least that amount of value back to the external world as a whole, and probably generating some huge multiple more, and yet all that counts as "no impact" to you. Especially when your comparison point isn't oncology or something, but doing research in PL theory of all things.
But I thank you for giving me the opportunity to get a little riled up on a lazy Sunday morning, it's one of my favorite hobbies. My recommendation to you for "increasing [overall] impact" is to read https://80000hours.org/ and follow their advice, and for "increasing impact [in this niche I really care about]" is simply to be more bounded with your claims.
>people would pay millions over a career without you generating at least that value back
Some of it is empirical observation. I've seen many friends at big/elite tech firms get paid to do very little. There are many claims online to that effect too, although I weigh those less heavily. And I think it's completely plausible. I think because of the exponential advancement of technology, the huge accrual of capital, and the inability of human incentive structures to keep up, value does not universally equal money. IMO there are many examples. Many people at tech firms do things that are very loosely related to revenue generation - so you can almost double your headcount during COVID, fire tons of them, and still function the same (a substantial amount of the hiring and firing was tech companies FOMOing about each other). Meta's VR division has burned through $50 billion, but its people got paid incredible salaries. One in three Nvidia employees are now worth over 20 million. Many of them were working decent jobs making GPUs for video games and suddenly, because of AI, their net worth went up 100x. Oncology is another possible example. By far the wealthiest people today are all in computers, instead of curing cancer.
I'm not saying these people are bad or anything like that. The other part of the equation, wealth as a signal, has become incredibly noisy. In some areas it is still a strong signal, typically smaller companies and startups where providing value is a lot more closely related to what you make. And conversely, I don't agree with money generated being a signal of impact in itself.
>I've seen many friends at big/elite tech firms get paid to do very little.
What matters is the outcome, not the amount of effort one puts in. If you're working at e.g. Google for $200,000 a year, your changes can affect millions to billions of people. At that scale even a small improvement like making Google Sheets load 1% faster can equate to millions of dollars of additional revenue downstream -- and likely tens of millions of dollars of actual value, since the largest tech companies actually capture only a low percentage of the value they create for their consumers.
You've just justified that $200k several times over for what might amount to two or three days' worth of effort for you, that's true. That's not a bug - that's a feature of working in a successful, scalable business. If you're inclined to do more than this "bare minimum" which you observe so many doing, just imagine how much value you could create for others if you actually worked close to your capacity in such a place.
>[B]ecause of the exponential advancement of technology, huge accrual of capital and inability of human incentive structures to keep up, value does not universally equal money.
I don't understand the thread of logic here. Claiming that human incentive structures are "unable to keep up" with value creation suggests to me that money is, if anything, a heavily lagging indicator of the real value one is generating, which is in line with the point above. But I don't think that is the point you are trying to make.
>Meta's VR division has burned through $50 billion, but its people got paid incredible salaries.
Most company actions are bets that the company's leadership think are net positive. Sometimes those bets don't pan out the way we expect them to - that's normal. Your own research might take longer than you expect it to, but that in itself isn't a reason to look back and say you made a bad bet.
As for the people, yes, you generally have to pay a lot to get top talent, and even that doesn't assure you of success. That's probably 2-4 years, out of a 30- or 40-year career, where their contributions may have been net negative to the bottom line. Maybe. If we include caveats like "Meta VR never becomes profitable in the future, either" and "none of the innovations from Meta VR turn out to be profit-generating via a different, unexpected mechanism". This probably equalizes out over the course of a career for the vast majority of these engineers. Not exactly a ship sinker.
>One in three Nvidia employees are now worth over 20 million. Many of them were working decent jobs making GPUs for video games and suddenly because of AI, their net worth went up 100x.
AI is hugely, hugely useful for all kinds of people. I use it every day both professionally and personally. Almost everyone I know does the same. If you truly derive no value at all from it, you are decidedly in the minority.
Is the claim here that they shouldn't have made money off of helping to manufacture the hardware that enables this invention which so many have found so enormously useful? Or maybe it's that since they never intended for their hardware to be useful for such a thing, their involvement should be worth less. That sounds way more like trying to invent a human incentive structure that can't keep up with the exponential advancement of technology than what we actually have. The current incentive structure, however, is wonderfully open to serendipity like this.
>The other part of the equation, wealth as a signal, has become incredibly noisy.
You've just given two examples where one company's wealth fell up to $50b because they made a bet on something that (for now) nobody wants, and another company's wealth went so high that a plurality of their employees are now millionaires because they made something everyone wants. That doesn't sound like a low signal-to-noise ratio to me.
>What matters is the outcome, not the effort
>At certain companies the scale could be enormous
The latter is true and I think the most legitimate reason for working at big companies. I should specify that in the first case they also accomplish little and affect very little. Things like internal tools that went nowhere, running basic ETL scripts, things like updating financial trade systems to comply with some new regulation. And this at a pretty slow pace.
My meaning about Nvidia and Meta VR is how people who didn't create value got enormously wealthy anyway. In Nvidia's case, traditional GPU teams (which I suspect received most of the benefit because they've vested the longest and made up most of Nvidia's pre-AI-boom headcount) got hugely rewarded by data center GPUs, which they played little role in. Conversely, Meta's VR team still got paid really well (their stock is even up because of AI hype, despite VR losses) despite their failure. So you have these systems where even if you fail or don't play any role in success, you're still paid enormously well. This is because companies capture the value, then distribute it in their very imperfect ways.
You're right that the valid reason for this is that tech companies act as risk-absorbing entities by paying people to take high-risk bets. But the necessary conditions for this are:
1) Hiring really good people (not just intelligent, but really motivated, curious, visionary etc)
2) A culture which promotes that
The on-the-ground reality of 1) is that it's a huge mess. The system has turned into a giant game. There are entire paid courses advertised to get a job at MAANG. The average entrant to MAANG spends six to eight months somersaulting through leetcode questions, making LinkedIn/Twitter/YouTube clones, doing interview prep, etc. Many causes for this, including the bureaucratization of tech companies, the massive supply of fresh grads, global economic disparities, etc. It's no longer the days when nerds, hackers and other thinkers drifted together.
2) Because of FOMO, AI hype and frankly a general lack of vision from many tech CEOs, it's just a mess. Anything AI is thrown piles of money at (hence the slew of ridiculous AI products). Everything else is under heavy pressure to AI-ify. I've heard from people inside that Google has really ended the kind of research culture that produced all the great inventions. There are still great people and teams, but they are increasingly isolated.
> Hear a lot of bad things about the academic rat race, pressure to publish even at the master's/PhD level
Strongly depends on the advisor and your goals. If you want to stay in academia, some amount of publications is required. Your advisor, especially if he pays your salary, may also push you to publish. If both are not an issue, I guess you can even finish without publications.
> I could probably hack out some paper into journals but whether I could have any real impact "on demand" (versus say spontaneously coming up with something) is a big question mark, especially within the deadlines given in the program
Nobody comes up with good ideas on demand. As you progress in your academic career, the rate of ideas (theoretically) grows. That's why you need the advisor: he can come up with ideas at a rate sufficient for his students.
>advisor might push to publish
That's fair. I'm just cautiously eyeing the likelihood of coming up with something publishable that's not a going-through-the-motions kind of thing.
> The main reason being getting a standard corporate job tends to nullify all impact you could have (with a few rare exceptions).
"Impact" is an ambiguous term, so it's quite vague what you mean. I assume "positive impact on the world and knowledge".
While this mantra is indeed motivational, it can set you up for disappointment, both in corporate as well as research/PhD settings, at the moment you realize how many hurdles there can be (toxic colleagues, bureaucracy, ignorance, etc.).
Also, for this interpretation of "impact", a corporate job can be very impactful as well.
>impact is ambiguous
This is the core of the issue (most replies usually involve some slightly different definitions). I take many definitions of impact, including societal use, contributing to knowledge, etc. But it's much clearer there are many things people do that are low impact, especially in places with a lot of bureaucracy, politics etc.
A corporate job can be, but it seems to me that, as a result of various incentives, corporate jobs tend to be compartmentalized, low-impact, and repetitive. We're also at a down cycle where tech, the historical haven for impact in a job, is scaling back a ton of things to focus on stock prices. If you know of any corporate jobs that do have impact by some definition of it, I'd love to hear it. In my experience these have been mostly startups.
Ask yourself this: has there been any useful programming language that has come out of PL research/academia in the last 20 years? The only example I can think of is Julia, and it only seems to be used by other academics.
If you’re looking to be impactful, you are much better off joining a job and working in your free time than doing a PhD. A PhD is a program to compete for academic prestige. Grad students want to publish papers that get them noticed at conferences, invited to give talks at prestigious universities, etc.; those are the incentives, always have been in academia. The brightest minds join academia because they care more about prestige than money (as they should; anyone can earn money, few can win a Nobel prize). In a healthy academic system, prestige is linked to real-world societal impact. That is still somewhat true in fields like Machine Learning; in some fields it seems to be completely disconnected from any real-world impact whatsoever (which seems to be the case for PL research). Our academic system unfortunately is a rotten carcass.
You could still, advisor willing, do research that interests you and not care at all whether you get noticed by conferences/journals, your peers, etc. But that takes a certain level of anti-social behavior that very few humans possess, and so I say join a job. Plenty of companies, like Google and Apple, are still building programming languages that are used by engineers worldwide, and if you finagle your way into a job on one of those teams, you will have a meaningful, impactful job, which is also well paying as a side bonus.
> has there been any useful Programming Language that has come out of PL research/ Academia in the last 20 years?
The goal of PL research is not, usually, to produce languages that see commercial adoption but to produce ideas that industry adopts. You cannot say a language like Rust is not influenced by PL research.
No, I can very strongly claim that I doubt any of the modern languages like Rust, Go etc have been influenced by the trainwreck that is programming language research.
PL research today is actually the study of something called “type theory,” whose relation to the act of building programming languages is the same relation a math PhD has to a carpenter. You will be a great mathematician if you do PL research but I would prefer if you do it in the maths department and not con us into believing it has something to do with programming languages. This is apparently what undergrads are taught in a compilers course: https://x.com/deedydas/status/1846366712616403366 I rest my case. (imagine the grad course syllabus)
On the fringes, you might find research groups who are doing interesting useful stuff in programming languages, but that is an exception to the rule. Which is probably why, you never hear any of the new language developers ever cite programming language research.
There is much more to PL research than "type theory". Look for instance at POPL 2024 program [1].
Also, Rust has been influenced by type theory. Rust's first compiler was written in OCaml, and the influence of OCaml/Haskell (and many other languages [2]) is pretty clear.
The goal of PL research isn't to design programming languages, but academic research has a lot of influence on programming languages.
[1] https://popl24.sigplan.org/program/program-POPL-2024/
[2] https://news.ycombinator.com/item?id=34704772
Edit: regarding https://x.com/deedydas/status/1846366712616403366?mx=2, these are just the formal specs of a type checker. Nothing magic or very complicated there; it's just a notation. Anyone who can understand and implement a type checker should be able to understand this notation as well.
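To make the "it's just a notation" point concrete, here is a minimal sketch of a type checker for the simply typed lambda calculus in OCaml (my own illustration, not code from the linked slides or from any real compiler; names like ty, term and typeof are made up for the example). Each typing rule in that judgment notation becomes one recursive match arm: the premises are the recursive calls, the conclusion is the returned type.

    (* Minimal sketch: types and terms of the simply typed lambda calculus. *)
    type ty =
      | TBool
      | TArrow of ty * ty                  (* function type: t1 -> t2 *)

    type term =
      | Var of string
      | Abs of string * ty * term          (* fun (x : t) -> body *)
      | App of term * term
      | True
      | False
      | If of term * term * term

    (* The context Gamma is just an association list of variable typings. *)
    type context = (string * ty) list

    exception Type_error of string

    (* typeof implements the judgment  Gamma |- t : ty  as plain recursion. *)
    let rec typeof (ctx : context) (t : term) : ty =
      match t with
      | Var x ->
          (try List.assoc x ctx
           with Not_found -> raise (Type_error ("unbound variable " ^ x)))
      | Abs (x, ty_arg, body) ->
          (* extend the context with x, then check the body *)
          TArrow (ty_arg, typeof ((x, ty_arg) :: ctx) body)
      | App (f, arg) ->
          (match typeof ctx f with
           | TArrow (ty_param, ty_res) when typeof ctx arg = ty_param -> ty_res
           | TArrow _ -> raise (Type_error "argument type mismatch")
           | TBool -> raise (Type_error "applying a non-function"))
      | True | False -> TBool
      | If (cond, t1, t2) ->
          if typeof ctx cond <> TBool then
            raise (Type_error "condition must be Bool");
          let ty1 = typeof ctx t1 in
          if ty1 = typeof ctx t2 then ty1
          else raise (Type_error "branches must have the same type")

    (* Example: (fun (x : Bool) -> x) true  has type Bool. *)
    let () =
      match typeof [] (App (Abs ("x", TBool, Var "x"), True)) with
      | TBool -> print_endline "well typed: Bool"
      | TArrow _ -> print_endline "unexpected arrow type"

The point is only that the Γ ⊢ e : τ notation on those slides and ordinary recursive code like this express the same algorithm.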
The creator of Rust in his own words:
“ Introducing: Rust Rust is a language that mostly cribs from past languages. Nothing new. Unapologetic interest in the static, structured, concurrent, large-systems language niche, Not for scripting, prototyping, or casual hacking, Not for research or exploring a new type system, Concentrates on known ways of achieving: More safety, More concurrency, Less mess, Nothing new? Hardly anything. Maybe a keyword or two, Many older languages better than newer ones: e.g., Mesa (1977), BETA (1975), CLU (1974) … We keep forgetting already-learned lessons., Rust picks from 80s/early 90s languages: Nil (1981), Hermes (1990), Erlang (1987), Sather (1990), Newsqueak (1988), Alef (1995), Limbo (1996), Napier (1985, 1988).”
If modern PL research is trying to take credit for the latest hot programming language (which I doubt they are, it’s only internet commentators who have nothing to do with PL research who argue with me. Actual PL researchers don’t care about Rust), they should be embarrassed.
Thank you for linking latest PL research, it has been a while since I’ve gone through it, glad to see nothing has changed. Ask yourself, how many of those talks in day 1, have accompanying code? is it even 25%?
For giggles I decided to peruse “Normal bisimulations by Value”. A 54-page dense paper with theorems, equations and lemmas. Lol, what are we even doing here? You can also notice that they don’t bother justifying their research in the intro or the abstract, or claiming relevance to any actual programming language. They themselves realize it’s just math, and “PL research” has become a euphemism for math. Frankly, even one such paper being accepted to a PL conference tells me something is going awry in the field, but if a majority of papers are like this, then the field is a wasteland that only serves to grind young talented minds into spending their lives chasing academic prestige with no value to society.
> Ask yourself, how many of those talks in day 1, have accompanying code? is it even 25%?
57 out of 93 papers (61%) published at POPL 24 have an artifact available. Note that this may also be automated proofs etc, it's not necessarily "running code".
But I also think focusing on POPL as a representation of the PL community isn't entirely fair. POPL is the primary conference focused on type systems within the PL community. It's a niche within a niche. Conferences like OOPSLA, ECOOP, or ICFP are much broader and much less likely to be so focused on mathematical proofs.
I asked Claude to go through all paper names and estimate how many have code vs how many are proofs:
“Based on my analysis, I estimate:
- ~35-40 papers (roughly 35%) likely have significant accompanying code
- ~55-60 papers (roughly 65%) are primarily theoretical/mathematical proofs”
I suspect even the remaining 35% doesn’t have much to do with programming languages, and I don’t think these stats change much for other conferences.
> I don’t think these stats change much for other conferences.
I'd severely doubt that: there is a large difference in focus on theory vs practice between conferences. POPL really is one of the more theoretical conferences. At a conference like ECOOP, you're unlikely to see many proofs (I'd guess at most 20% of papers, based on personal experience).