
The Cognitive Science Age


Complex patterns in the Namib desert resemble neural networks.

The history of science and technology is often delineated by paradigm shifts. A paradigm shift is a fundamental change in how we view the world and our relationship with it. The biggest paradigm shifts are sometimes even referred to as an “age” or a “revolution”. The Space Age is a perfect example. The middle of the 20th century saw an incredible increase in public awareness of space and space travel, and many of the industrial and technical advances that we now take for granted were byproducts of the Space Age.

The Cognitive Science Age

It’s probably cliché to write this, but I believe we are at the beginning of a new age and a new, profound paradigm shift. I think we’re well into the Cognitive Science Age. I’m not sure anyone calls it that, but I think that is what truly defines the current era. And I also think that an understanding of Cognitive Science is essential for understanding our relationships with the world and with each other.

I say this because in the 21st century, artificial intelligence, machine learning, and deep learning are being fully realized. Every day, computers are solving problems, making decisions, and making accurate predictions about the future…about our future. Algorithms shape our behaviour in more ways than we realize. We look forward to autonomous vehicles that will depend on the simultaneous operation of many computers and algorithms. Machines have become, and will continue to become, central to almost everything.

And this is a product of Cognitive Science. As cognitive scientists, this new age is our idea, our modern Prometheus.

Cognitive Science 

Cognitive Science is an interdisciplinary field that first emerged in the 1950s and 1960s and sought to treat cognition, or information processing, as its own object of study rather than as a strictly human psychological concept. As a new field, it drew from Cognitive Psychology, Philosophy, Linguistics, Economics, Computer Science, Neuroscience, and Anthropology. Although people still tend to work and train in those more established traditional fields, it seems to me that society as a whole is indebted to the interdisciplinary nature of Cognitive Science. And although it is a very diverse field, the most important aspect in my view is the connection between biology, computation, and behaviour.

The Influence of Biology

A dominant force in modern life is the algorithm: a computational engine that processes information and makes predictions. Learning algorithms take in information, learn to make associations, make predictions from those associations, and then adapt and change. This is referred to as machine learning, but the key here is that these machines learn in ways inspired by biology.

For example, one of the foundational learning principles behind machine learning (Hebbian learning) was described by the psychologist and neuroscientist Donald Hebb at McGill University. Hebb’s 1949 book, The Organization of Behaviour, is one of the most important books written in this field and explained how neurons learn associations. This concept was refined mathematically by the Cognitive Scientists Marvin Minsky, David Rumelhart, James McClelland, Geoff Hinton, and many others. The advances we see now in machine learning and deep learning are a result of Cognitive Scientists learning how to adapt and build computer algorithms to match algorithms already seen in neurobiology. This is a critical point: it’s not just that computers can learn, but that the learning and adaptability of these systems is grounded in an understanding of neuroscience. That’s the advantage of an interdisciplinary approach.
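To make this concrete, here is a minimal sketch of the Hebbian idea in Python. The learning rate and activity patterns are arbitrary, made-up values, not anything from Hebb’s book; the sketch only illustrates the principle that connections strengthen when the units they join are active together.

```python
import numpy as np

eta = 0.5                          # learning rate (arbitrary illustrative value)
x = np.array([1.0, 0.0, 1.0])      # activity of three input neurons
y = np.array([0.0, 1.0])           # activity of two output neurons, paired with x

W = np.zeros((2, 3))               # connection weights, initially unlearned
for _ in range(10):                # repeated pairings strengthen the association
    W += eta * np.outer(y, x)      # Hebb rule: delta W[i, j] = eta * y[i] * x[j]

print(W @ x)                       # the input now evokes a scaled copy of the paired output
```

After a few pairings, presenting the input pattern alone reproduces a scaled version of the output pattern it was paired with, which is the sense in which the network has learned an association.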

The Influence of Behaviour 

As another example, the theoretical grounding for the AI revolution was developed by Allen Newell (a computer scientist) and Herbert Simon (an economist). Their work from the 1950s to the 1970s on understanding human decision making and problem solving, and how to model them mathematically, provided a computational approach that was grounded in an understanding of human behaviour. Again, this is an advantage of the interdisciplinary approach afforded by Cognitive Science.

The Influence of Algorithms on our Society 

Perhaps one of the most salient and immediately present ways to see the influence of Cognitive Science is in the algorithms that drive the many products that we use online. Google is many things, but at its heart, it is a search algorithm and a way to organize the knowledge in the world so that the information that a user needs can be found. The basic ideas of knowledge representation that underlie Google’s categorization of knowledge were explored early on by Cognitive Scientists like Eleanor Rosch and John Anderson in the 1970s and 1980s. 

Or consider Facebook. The company designs and runs a sophisticated algorithm that learns what you value and makes suggestions about what you want to see more of. Or, maybe more accurately, it makes suggestions for what the algorithm predicts will help you to expand your Facebook network… predictions for what will make you use Facebook more.

In both of these cases, Google and Facebook, the algorithms learn to connect the information that they acquire from the user, from you, with the existing knowledge in the system. They make predictions that are useful and adaptive for the users, so that the users will provide more information to the system, so that it can refine its algorithm and acquire still more information, and so on. As the network grows, it seeks to become more adaptive, more effective, and more knowledgeable. This is what your brain does, too: it causes you to engage in behaviour that seeks information to refine its ability to predict and adapt.

These networks and algorithms are societal minds; they serve the same role for society that our own network of neurons serves for our body. Indeed, these algorithms can even change society. This is something that some people fear.

Are Fears of the Future Well Founded?

When tech CEOs and politicians worry about the dangers of AI, I think that idea is at the core of their worry. The idea that the algorithms to which we entrust more and more of our decision making are altering our behaviour to serve the algorithm, in the same way that our brain alters our behaviour to serve our own mind and body, is something that strikes many as unsettling and unstoppable. I think these fears are well founded and unavoidable, but like any new age or paradigm shift, we should continue to approach and understand this from scientific and humanist directions.

The Legacy of Cognitive Science

The breakthroughs of the 20th and 21st centuries arose from exploring learning algorithms in biology, the instantiation of those algorithms in increasingly powerful computers, and the relationship of both of these to behaviour. Technological improvements in computing and neuroscience have enabled these ideas to become a dominant force in the modern world. Fear of a future dominated by non-human algorithms and intelligence may be unavoidable at times, but an understanding of Cognitive Science is crucial to being able to survive and adapt.

 

Open Science: My List of Best Practices


This has nothing to do with Open Science. I just piled these rocks up at Lake Huron

Are you interested in Open Science? Are you already implementing Open Science practices in your lab? Are you skeptical of Open Science? I have been all of the above and some recent debates on #sciencetwitter have been discussing the pros and cons of Open Science practices. I decided to write this article to share my experiences as I’ve been pushing my own research in the Open Science direction.

Why Open Science?

Scientists have a responsibility to communicate their work to their peers and to the public. This has always been part of the scientific method, but the methods of communication have differed throughout the years and differ by field. This essay reflects my opinions on Open Science (capitalized to reflect that it is a set of principles), and I also give an overview of my lab’s current practices. I’ve written about this in my lab manual (which is also open), but until I sat down to write this essay, I had not really codified how my lab and research have adopted Open Science practices. This should not be taken as a recipe for your own science or lab, and these ideas may not apply to other fields. This is just my experience trying to adopt Open Science practices in my Cognitive Psychology lab.

Caveats First

Let’s get a few things out of the way…

First, I am not an expert in open science. In fact, until about 2–3 years ago, it never even occurred to me to create a reproducible archive for my data, to ensure that I could provide analysis scripts to someone else so that they could reproduce my analysis, or to provide copies of all of the items and stimuli that I used in a psychology experiment. I’ve received requests for data before, but I usually handled those in a piecemeal, ad hoc fashion. If someone asked, I would put together a spreadsheet.

Second, my experience is only generalizable to comparable fields. I work in cognitive psychology and have collected behavioural data, survey questionnaire data, and electrophysiological data. I realize that data sharing can be complicated by ethics concerns for people who collect sensitive personal or health data. I realize that other fields collect complex biological data that may not lend itself well to immediate sharing.

Finally, the principles and best practices that I’m outlining here were adopted in 2018. Some of this was developed over the course of the last few years, but this is how we are running our lab now, and how we plan to run my research lab for the foreseeable future. That means there are still gaps: studies that were published a few years ago that have not yet been archived, papers that may not have a preprint, analyses that were done 20 years ago in SAS on the VAX 11/780 at the University at Buffalo. And if anyone wants to see data from my well-cited 1998 paper on prototype and exemplar theory, I can get it, but it is not going to be easy.

Core Principles

There are many aspects to Open Science, but I am going to outline three areas that cover most of them. There will be some overlap, and some aspects may be missed.

Materials and Methods

The first aspect of Open Science concerns openness with respect to methods, materials, and reproducibility. In order to satisfy this criterion, a study or experiment should be designed and written up in such a way that another scientist or lab in the same field would be able to carry out the same kind of study if they wanted to. That means that any equipment that was used is described in enough detail or is readily available. It also means that the computer programs used to carry out the study are accessible and the code is freely available. As well, in psychology, there are often visual, verbal, or auditory stimuli that participants make decisions about, or questions that they answer. These should also be available.

Data and Analysis

The second aspect of Open Science concerns the open availability of the data that were collected in the study. In psychology, data take many forms, but usually they are responses by participants to surveys or to visual stimuli, EEG recordings, or data collected in an fMRI study. In other fields, they may consist of observations taken at a field station, measurements taken of an object or substance, or trajectories of objects in space. Anything that is measured, collected, or analyzed for a publication should be available to other scientists in the field.

Of course, in a research study or scientific project, the data that have been collected are also processed and analyzed. Here, several decisions need to be made. It may not always be practical to share raw data, especially if things were recorded by hand in a notebook or if the digital files are so large as to be unmanageable. On the other hand, it may not be useful to publish data that have been processed and summarized too much. For most fields, there is probably a middle ground where the data have been cleaned and minimally processed but no statistical analyses have been done, and the data have not been transformed. The path from raw data to this minimal state should be clear and transparent. In my experience so far, this is one of the most difficult decisions to make. I don’t have a solid answer yet.

In most scientific fields, data are analyzed using software and field-specific statistical techniques. Here again, several decisions need to be made while the research is being done in order to ensure that the end result is open and usable. For example, if you analyze your data with Microsoft Excel, what might be simple and straightforward to you might be uninterpretable to someone else. This is especially true if there are pivot tables, unique calculations entered into various cells, and transformations that have not been recorded. This, unfortunately, describes a large part of the data analysis I did as a graduate student in the 1990s. And I’m sure I’m not alone. Similarly, any platform that is proprietary will present limits to openness. This includes Matlab, SPSS, SAS, and other popular computational and analytic software. I think that’s why you see so many people who are moving towards Open Science practices encouraging the use of R and Python, because they are free, openly available, and they lend themselves well to scientific analysis.
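As a concrete illustration, here is a minimal sketch, in Python, of what a transparent path from raw data to a shareable, minimally processed file might look like. The file names, column names, and cleaning rules are hypothetical; the point is that every decision is recorded in a script rather than hidden in a spreadsheet.

```python
import pandas as pd

# Read the raw, per-trial data exactly as the experiment software wrote it.
raw = pd.read_csv("raw/participant_data_raw.csv")

# Every cleaning decision lives in code: drop practice trials, remove
# implausibly fast responses, and keep only the columns needed to
# reproduce the published analyses.
cleaned = (
    raw[raw["block"] != "practice"]
    .query("rt_ms > 200")
    .loc[:, ["participant", "condition", "stimulus", "response", "rt_ms"]]
)

# This minimally processed file is what gets posted to the public repository.
cleaned.to_csv("shared/participant_data_cleaned.csv", index=False)
```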

Publication

The third aspect of Open Science concerns the availability of the published data and interpretations: the publication itself. This is especially important for any research that is carried out at a university or research facility that is supported by public research grants. Most of these funding agencies require that you make your research accessible.

There are several good open access research journals that make the publications freely available for anyone because the author helps to cover the cost of publication. But many traditional journals are still behind a paywall and are only available for paid subscribers. You may not see the effects of this if you’re working in a university because your institution may have a subscription to the journal. The best solution is to create a free and shareable version of your manuscript, a preprint, that is available on the web and that anyone can access but does not violate the copyright of the publisher.

Putting this into practice

I tried to put some guidelines in place in my lab to address these three aspects of open science. I started with one overriding principle: When I submit a manuscript for publication in a peer-reviewed journal, I should also ensure that at the time of submission, I have a complete data file that I can share, analysis scripts that I can share, and a preprint.

I have implemented as much of this as possible with every paper that we’ve submitted for publication since late 2017 and with all our ongoing projects. We don’t submit a manuscript until we can meet the following:

  • We create a preprint of the manuscript that can be shared via a public online repository. We post this preprint to the online repository at the same time that we submit the manuscript to the journal.
  • We create shareable data files for all of the data collected in the study described in that manuscript. These are almost always unprocessed or minimally processed data in a Microsoft Excel spreadsheet or a text file. We don’t use Excel for any summary calculations, so the data are just data.
  • As we’re carrying out the data analysis, we document our analyses in R notebooks. We share the R scripts and notebooks for all of the statistical analyses and data visualizations in the manuscript. These are open and accessible and should match exactly what appears in the manuscript. In some cases, we have posted R notebooks with additional data visualizations beyond what is in the manuscript as a way to add value.
  • We also create a shareable document for any nonproprietary assessments or questionnaires that were designed for this study and copies of any visual or auditory stimuli used in the study.

Now, with this list of best practices in hand, it would be disingenuous to suggest that every single paper from my lab meets all of those criteria. For example, one recently published study made use of Matlab instead of Python, because that’s how we knew how to analyze the data. But we’re using these principles as a guide as our work progresses. I view Open Science and these guidelines as an important and integral part of training my students, just as important as the theoretical contributions that we’re making to the field.

Additional Resources and Suggestions

In order to achieve this goal, the following guidelines and resources have been helpful to me.

OSF

My public OSF profile lists current and recent projects. OSF stands for “Open Science Framework”, and it’s one of many data repositories that can be used to share data, preprints, unformatted manuscripts, analysis code, and other things. I like OSF, and it’s kind of incredible to me that this wonderful resource is free for scientists to use. But if you work at a university or public research institute, your library probably runs a public repository as well.

Preregistration

For some studies, preregistration may be a helpful additional step in carrying out the research. There are limits to preregistration, many of which are addressed with Registered Reports. At this point, we haven’t done any Registered Reports. Preregistration is helpful, though, because it encourages the researcher to lay out a list of the analyses they plan to do, to describe how the data are going to be collected, and to make that plan publicly available before the data are collected. This doesn’t mean that preregistered studies are necessarily better, but it’s one more tool to encourage openness in science.

Python and R

If you’re interested in open science, it really is worth looking closely at R and Python for data manipulation, visualization, and analysis. In psychology, for example, SPSS has been a long-standing and popular way to analyze data. SPSS does have a syntax mode that allows the researcher to share their analysis protocol, but that mode of interacting with the program is much less common than the GUI. Furthermore, SPSS is proprietary: if you don’t have a license, you can’t easily look at how the analyses were done. The same is true of data manipulation in Matlab. My university has a license, but if I want to share my data analysis with a private company, they may not have one. But anyone in the world can install and use R and Python.
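To give a sense of what this looks like in practice, here is a small sketch of the kind of analysis script anyone could run without buying a license. The file, columns, and comparison are hypothetical, and a real project would include the full set of analyses reported in the paper.

```python
import pandas as pd
from scipy import stats

# Load the shared, minimally processed data file.
data = pd.read_csv("shared/participant_data_cleaned.csv")

# Descriptive statistics by condition.
print(data.groupby("condition")["rt_ms"].agg(["mean", "std", "count"]))

# Compare mean response times between two (hypothetical) conditions.
group_a = data.loc[data["condition"] == "A", "rt_ms"]
group_b = data.loc[data["condition"] == "B", "rt_ms"]
t, p = stats.ttest_ind(group_a, group_b)
print(f"t = {t:.2f}, p = {p:.4f}")
```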

Conclusion

Science isn’t a matter of belief. Science works when people trust in the methodology, the data and interpretation, and by extension, the results. In my view, Open Science is one of the best ways to encourage scientific trust and to encourage knowledge organization and synthesis.

Cognitive Bias and the Gun Debate


Image from Getty

I teach a course at my Canadian university on the Psychology of Thinking, and in this course we discuss topics like concept formation, decision making, and reasoning. Many of these topics lend themselves naturally to discussions of current events, and in one class last year, after a recent mass shooting in the US, I posed the following question:

“How many of you think that the US is a dangerous place to visit?”

About 80% of the students raised their hands. This surprised me because although I live and work in Canada and I’m a Canadian citizen, I grew up in the US; my family still lives there and I still think it’s a reasonably safe place to visit. Most students justified their answer by referring to school shootings, gun violence, and problems with American police. Importantly, none of these students had ever actually encountered violence in the US. They were thinking about it because it had been in the news. They were making a judgment about the likelihood of violence on the basis of the evidence available to them.

Cognitive Bias

The example above is an instance of a cognitive bias known as the Availability Heuristic. The idea, originally proposed in the early 1970s by Daniel Kahneman and Amos Tversky (Kahneman & Tversky, 1979; Tversky & Kahneman, 1974), is that people generally make judgments and decisions on the basis of the most relevant memories they retrieve, the ones that are available at the time the assessment or judgement is made. In other words, when you make a judgment about a likelihood of occurrence, you search your memory and decide on the basis of what you remember. Most of the time, this heuristic produces useful and correct judgments. But in other cases, the available evidence may not correspond to the evidence in the world. For example, we typically overestimate the likelihood of shark attacks, airline accidents, lottery wins, and gun violence.

Another cognitive bias (also from Kahneman and Tversky) is known as the Representativeness Heuristic. This is the general tendency to treat individuals as representative of their entire category. For example, if I formed a concept of American gun owners as being violent (based on what I’ve read or seen in the news), I might infer that each individual American gun owner is violent. I’d be making a generalization, or a stereotype, and this can lead to bias in how I treat people. As with availability, the representativeness heuristic arises out of the natural human tendency to generalize information. Most of the time, this heuristic produces useful and correct judgments. But in other cases, the representative evidence may not correspond to individual cases in the world.

The Gun Debate in the US

I’ve been thinking about this a great deal as the US engages in its ongoing debate about gun violence and gun control. It’s been reported widely that the US has the highest rate of private gun ownership in the world, and it also has an extraordinary rate of gun violence relative to other countries. These are facts. Of course, we all know that “correlation does not equal causation”, but many strong correlations do derive from a causal link. The most reasonable thing to do would be to begin to implement legislation that restricts access to firearms, but this never happens, even though many people are passionate about the need to restrict guns.

So why do we continue to argue about this? One problem that I rarely see discussed is that many of us have limited experience with guns and/or violence, so we have to rely on what we know from memory and from external sources, and that makes us susceptible to cognitive biases.

Let’s look at things from the perspective of an average American gun owner. This might be you, people you know, family, etc. Most of these gun owners are very responsible, knowledgeable, and careful. They own firearms for sport and also for personal protection, and in some cases they even run successful training courses for people to learn about gun safety. From the perspective of a responsible and passionate gun owner, it seems quite true that the problem is not guns per se but the bad people who use them to kill others. After all, if you are safe with your guns and all your friends and family are safe, law-abiding gun owners too, then those examples will be the most available evidence for you to use in a decision. And so you base your judgements about gun violence on this available evidence and decide that gun owners are safe. As a consequence, gun violence is not a problem of guns and their owners, but must be a problem of criminals with bad intentions. Forming this generalization is an example of the availability heuristic. It may not be entirely wrong, but it is a result of a cognitive bias.

But many people (myself included) are not gun owners. I do not own a gun, but I feel safe at home. As violent crime rates decrease, the likelihood of being a victim of a crime that a gun could prevent is very small; most people will never find themselves in that situation. In addition, my personal freedoms are not infringed by gun regulation, and I too recognize that illegal guns are a problem. If I generalize from my experience, I may have difficulty understanding why people would need a gun in the first place, whether for personal protection or for a vaguely defined “protection from tyranny”. From my perspective, it’s far more sensible to focus on reducing the number of guns. After all, I don’t have one, I don’t believe I need one, so I generalize to assume that anyone who owns firearms might be suspect or irrationally fearful. Forming this generalization is also an example of the availability heuristic. It may not be entirely wrong, but it is a result of a cognitive bias.

In each case, we are relying on cognitive biases to infer things about other people and about guns, and these inferences may be stifling the debate.

How do we overcome this?

It’s not easy to overcome a bias, because these cognitive heuristics are deeply engrained and indeed arise as a necessary function of how the mind operates. They are adaptive and useful. But occasionally we need to override a bias.

Here are some proposals, but each involves taking the perspective of someone on the other side of this debate.

  1. Those of us on the left of the debate (liberals, proponents of gun regulation) should try to recognize that nearly all gun enthusiasts are safe, law-abiding people who are responsible with their guns. Seen through their eyes, the problem lies with irresponsible gun owners. What’s more, the desire to place restrictions on their legally owned guns activates another cognitive bias known as the endowment effect, in which people place high value on something that they already possess; the prospect of losing it is aversive because it increases the feeling of uncertainty about the future.
  2. Those on the right (gun owners and enthusiasts) should consider the debate from the perspective of non-gun-owners and consider that proposals to regulate firearms are not attempts to seize or ban guns, but rather attempts to address one aspect of the problem: the sheer number of guns in the US, any of which could potentially be used for illegal purposes. We’re not trying to ban guns, but rather to regulate them and encourage greater responsibility in their use.

I think these things are important to deal with. The US really does have a problem with gun violence; it is disproportionately high. Solutions to this problem must recognize the reality of the large number of guns, the perspectives of non-gun-owners, and the perspectives of gun owners. We’re only going to get there by first recognizing these cognitive biases and then attempting to overcome them in ways that search for common ground. By recognizing this, and maybe stepping back just a bit, we can begin to have a more productive conversation.

As always: comments are welcome.

How do you plan to use your PhD?

If you follow my blog or Medium account, you’ve probably already read some of my thoughts and musings on the topic of running a research lab, training graduate students, and being a mentor. I think I wrote about that just a few weeks ago. But if you haven’t read any of my previous essays, let me provide some context. I’m a professor of Psychology at a large research university in Canada, the University of Western Ontario. Although we’re seen as a top choice for undergraduates because of our excellent teaching and student life, we also train physicians, engineers, lawyers, and PhD students in dozens of fields. My research group fits within the larger area of Cognitive Neuroscience, which is one of our university’s strengths.

Within our large group (Psychology, the Brain and Mind Institute, BrainsCAN, and other groups) we have some of the very best graduate students and postdocs in the world, not to mention some of my excellent faculty colleagues. I’m not writing any of this to brag or boast but rather to give the context that we’re a good place to study cognition, psychology, and neuroscience.

And I’m not sure any of our graduates will ever get jobs as university professors.

The Current State of Affairs

Gordon Pennycook, from Waterloo and soon the University of Regina, wrote an excellent blog post and paper on the job market for cognitive psychology professors in Canada. You might think this is too specialized, but he makes the case that we can probably extrapolate to other fields and countries and find the same thing. And since this is my field (and Gordon’s too), it’s easy to see how this affects students in my lab and in my program.

One thing he noted is that the average Canadian tenure-track hire now has 15 publications on their CV when hired. That’s a long CV, as long as what I submitted in my tenure dossier in 2008. It’s certainly a longer CV than what I had when I was hired at Western in 2003: I was hired with 7 publications (two first-author) after three years as a postdoc and three years of academic job applications. And it’s certainly longer than what the most eminent cognitive psychologists had when they were hired. Michael Posner, whose work I cite to this day, was hired straight from Wisconsin with one paper. John Anderson, whose work I admire more than that of any other cognitive scientist, was hired with a PhD from Yale and 5 papers on his CV. Nancy Kanwisher was hired in 1987 with 3 papers from her PhD at UCLA.

Compare that to a recent hire in my own group, who was hired with 17 publications in great journals and was a postdoc for 5 years. Or compare that to most of our recent hires and short-listed applicants who have completed a second postdoc before they were hired.  Even our postdoctoral applicants, people applying for 2-3 year postdocs at my institution, are already postdocs and are looking to get a better postdoc to get more training and become more competitive.

So it’s really a different environment today.

The fact is, you will not get a job as a professor right after finishing a PhD. Not in this field, and not in most fields. Why do I say this? Well, for one, it’s not possible to publish 15-17 papers during your PhD. Not in my lab, at least. Even if I added every student to every paper I published, they would not have a CV with that many papers; I simply can’t publish that many papers and keep everything straight. And I can’t really put every student on every paper anyway. If the PhD is not adequate for getting a job as a professor, what does that mean for our students, our program, and for PhD programs in general?

Expectation mismatch

Most students enter a PhD program with the idea of becoming a professor. I know this because I used to be the director of our program, and that’s what nearly every student says, unless they are applying to our clinical program with the goal of being a clinician. If students are seeking a PhD to become a professor, but we can clearly see that the PhD is not sufficient, then students’ expectations are not being met by our program. We admit students to the PhD with most hoping to become university professors, and then they slowly learn that it’s not possible. Our PhD is, in this scenario, merely an entry into the ever-lengthening postdoc stream, which is where you prepare to be a professor. We don’t have well-thought-out alternatives for any other stream.

But we can start.

Here’s my proposal

  1. We have to level with students and applicants right away that “tenure-track university professor” is not going to be the end game of the PhD. Even the very best students will be looking at 1-2 postdocs before they are ready for that. For academic careers, the PhD is training for the postdoc in the same way that med school is training for residency and fellowship.
  2. We need to encourage students to begin thinking about non-academic careers in their first year. This means encouraging students’ ownership of their career planning. There are top-notch partnership programs like Mitacs and OCE (these are Canadian, but programs like this exist in the US, EU, and UK) that help students transition into corporate and industrial careers. We have university programs as well. And we can encourage students to look at certificate programs to ensure that their skills match the market. But students won’t always know about these things if their advisors don’t know or care.
  3. We need to emphasize and cultivate a supportive atmosphere. Be open and honest with students about these things and encourage them to be open as well. Students should be encouraged to explore non-academic careers and not made to feel guilty for “quitting academia”.

I’m trying to manage these things in my own lab. It is not always easy, because I was trained to all but expect that the PhD would lead to a job as a professor. That was not really true when I was a student, and it’s even less true now. But I have to adapt. Our students and trainees have to adapt, and it’s incumbent upon us to guide and advise.

I’d be interested in feedback on this topic.

  • Are you working on a PhD to become a professor?
  • Are you a professor wondering if you’d be able to actually get a job today?
  • Are you training students with an eye toward technical and industrial careers?

 

The Unintended Cruelty of America’s Immigration Policies


Image from https://goo.gl/HtfqLa

It is well documented that the Trump administration is pursuing a senselessly cruel policy of prosecuting migrants at the border, detaining families, and incarcerating them in large, improvised detention centres. This includes taking children away from their parents and siblings and housing them separately for extended periods.

Pointlessly Cruel

Jeff Sessions has pointed out that this policy is “simply enforcing the law” and that it’s a deterrent. He lays any negative consequences on the migrant families themselves, asking why they would risk bringing their children on this long and dangerous trek. Other members of the administration have noted that families who claim asylum at ports of entry are not being detained or split apart. This too is disingenuous: the Trump administration has narrowed the grounds for asylum, and as the border has become increasingly militarized, migrants and asylum-seekers are being forced away from busy ports of entry and often into dangerous crossings.

How did we get to this point? How did a nation which once prided itself on welcoming immigrants become a nation increasingly looking to punish individuals even as they seek asylum? Although some aspects of this cruel policy have long been present in America’s history, I think the particular fixation on migration from Mexico stems from an unintended starting point.

Unintended Consequences

A recent podcast by Malcolm Gladwell explored the causes and effects of the militarized US-Mexico border. I found this podcast fascinating and I recommend listening to it. To summarize: for most of the 20th century, into the 1960s and 1970s, migration between the United States and Mexico was primarily cyclical. Migrants from rural areas of Mexico near the border would move to the United States for work, stay for a few months, and move back to Mexico to their families. This was an economic relationship, and it worked because the cost of crossing the border was essentially zero. If you were apprehended, you’d be returned, but otherwise the system allowed migrants to flow into and out of the United States.

In the early 1970s, however, the US-Mexico border began to be militarized. It happened almost by accident. An extremely skilled and dedicated retired Marine general took over the Immigration and Naturalization Service and began to tighten up the way border patrols operated. There was never any intent to cause suffering. On the contrary, the original intent seems to have been to harmonize border enforcement with existing law in a way that benefited everyone. But as the border became less porous, migrants began seeking out more dangerous crossings. Often these were in the high desert, where the risk of injury and death was higher. As the cost of crossing the border back and forth increased because of this danger, migrants became less likely to engage in cyclical migration; instead they stayed in the United States and either sent money home to Mexico or brought their families to join them.

This has profound implications for the current state of affairs. As each successive administration cracks down on illegal immigration, tightens the border, and militarizes the border patrol, it increases the risks and costs associated with crossing back and forth. Migrants still want to come to America, people are still claiming asylum, but illegal immigrants in the United States are persecuted and stay in hiding. Every indication is that the worst possible thing that could be done would be the actual construction of a wall. In some ways, an analogy can be drawn to desire paths in public spaces: there is a natural flow to collective human behaviour, and when civic planning and architecture do not match it, human behaviour wins out. People will continue to migrate, and this will continue to be a problem.

Gladwell doesn’t say this, but it seems to me that the most rational and humane solution is a porous border. With a porous border, illegal immigrants are turned back when apprehended, but in a straightforward way. People are not held in detention centres. Families are not charged with a misdemeanour offence and jailed prior to their hearings, which is what necessitates removing their children. With a porous border, there is still border security, but the overall level of enforcement is lower. In addition, a policy like this could benefit from increased access to green cards, recognizing that many migrants wish to work in the United States for only a few months. Unfortunately, no one in the Southwest (or anywhere else in America) is going to win an election with the promise of “Let’s make our border more porous and engage in lax border security.” That will not sell. But the evidence presented by the Mexican Migration Project and reviewed by Gladwell in his podcast suggests this would still be the most rational solution.

More Objective Research

This is one of those cases where we need more objective policy research and less political rhetoric. Has anyone asked an algorithm or computer model to determine the ideal level of border security? How much flow is tolerable? How does one balance the economic detriment of a relatively free flow of migrants against the costs associated with apprehension, detention, deportation, and any associated criminal proceedings? The latter are expensive and human-resource intensive. Do the risks of a porous border justify these expenses?

The thing is, these are computational problems. These are problems that demand rigorous computational analysis, not moralistic grandstanding about law-breaking or fears of drugs and criminals pouring over the border.
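To illustrate what I mean, here is a deliberately toy sketch of how the trade-off could be posed as a computational question. Every number below is a made-up placeholder rather than an estimate, and the functional forms are invented; the point is only that the question has the shape of an optimization problem.

```python
import numpy as np

enforcement = np.linspace(0, 1, 101)   # 0 = fully porous, 1 = fully militarized

# Placeholder behavioural and cost assumptions (all invented for illustration):
attempts = 1_000_000 * (1 - 0.5 * enforcement)     # crossing attempts fall somewhat with enforcement
apprehended = attempts * enforcement                # apprehensions rise with enforcement
enforcement_cost = 5e9 * enforcement                # patrols, detention, courts
processing_cost = 2_000 * apprehended               # per-apprehension processing cost
labour_benefit = 3_000 * (attempts - apprehended)   # economic value of cyclical migrant labour

net_cost = enforcement_cost + processing_cost - labour_benefit
best = enforcement[np.argmin(net_cost)]
print(f"Lowest net-cost enforcement level under these placeholder assumptions: {best:.2f}")
```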

The evidence seems to suggest that for decades, the relatively porous border had no ill effects on American society and was mutually beneficial to the US and to Mexican border regions. Though unintended, the slow militarization of the US-Mexico border restricted migration and made it more dangerous, which created real costs from illegal immigration, thus necessitating a stronger, more militaristic response, which creates a feedback loop. The harsher the enforcement, the worse the problem gets.

The current administration has adopted the harshest enforcement yet, one that in my view is intentionally cruel, is a clear moral failing, and one that may be destined to fail anyway.

The Professor, the PI, and the Manager

Here’s a question that I often ask myself: How much should I be managing my lab?

I was meeting with one of my trainees the other day, and this grad student mentioned that they sometimes feel like they don’t know what to do during the work day and that they are wasting a lot of their time. As a result, this student will end up going home and maybe working on a coding class or (more often) doing non-grad-school things. We talked about what this student is doing, and I agreed: they are wasting a lot of time and not really working very effectively.

Before I go on, some background…

There is no shortage of direction in my lab, or at least I don’t think so. I think I have a lot of things in place. Here’s a sample:

  • I have a detailed lab manual that all my trainees have access to. I’ve sent this document to my lab members a few times, and it covers a whole range of topics about how I’d like my lab group to work.
  • We meet as a lab 2 times a week. One day is to present literature (journal club) and the other day is to discuss the current research in the lab. There are readings to prepare, discussions to lead, and I expect everyone to contribute.
  • I meet with each trainee, one-on-one, at least every other week, and we go through what each student is working on.
  • We have an active lab Slack team, and every project has a channel.
  • We have a project management Google sheet with deadlines and tasks that everyone can edit, add things to, see what’s been done and what hasn’t been done.

So there is always stuff to do, but I also try not to micromanage my trainees. I generally assume that students will want to be learning and developing their scientific skill set. This student is someone who has been pretty set on looking for work outside of academia, and I’m a big champion of that; I am a champion of helping any of my trainees find a good path. But despite all the project management and meetings, this student was feeling lost and never sure what to work on. And so they were feeling like grad school has nothing to offer in the realm of skill development for that career direction. Are my other trainees feeling the same way?

Too much or too little?

I was kind of surprised to hear one of my students say that they don’t know what to work on, because I have been working harder than ever to make sure my lab is well structured. We’ve even dedicated several lab meetings to the topic.

The student asked what I work on during the day, and it occurred to me that I don’t always discuss my daily routine. So we met for over an hour and I showed this student what I’d been working on for the past week: an R notebook that will accompany a manuscript I’m writing and that will allow all the analyses of an experiment to be open and transparent. We talked about how much time that’s been taking: how I spent 1-2 days optimizing the R code for a computational model, how this code will then need clear documentation, how the OSF page will also need folders for the data files, stimuli, and experimenter instructions, and how those need to be uploaded. I have been spending dozens of hours on this one small part of one component of one project within one of the several research areas in my lab, and there’s so much more to do.

Why aren’t my trainees doing the same? Why aren’t they seeing this, despite all the project management I’ve been doing?

I want to be clear: I am not trying to be critical of any of my trainees, and I’m not singling anyone out. They are good students, and it’s literally my job to guide and advise them. So I’m left with the feeling that they are feeling unguided, with the perception that there’s not much to do. If I’m supposed to be the guide and they are feeling unguided, this seems like a problem with my guidance.

What can I do to help motivate?

What can I do to help them organize, feel motivated, and productive?

I expect some independence for PhD students, but am I giving them too much? I wonder if my lab would be a better training experience if I were just a bit more of a manager.

  • Should I require students to be in the lab every day?
  • Should I expect daily summaries?
  • Should I require more daily evidence that they are making progress?
  • Am I sabotaging my efforts to cultivate independence by letting them be independent?
  • Would my students be better off if I assumed more of a top down, managerial role?

I don’t know the answers to these questions. But I know that there’s a problem. I don’t want to be a boss, expecting them to punch the clock, but I also don’t want them to float without purpose.

I’d appreciate input from other PIs. How much independence is too much? Do you find that your grad students are struggling to know what to do?

If you have something to say about this, let me know in the comments.

River Water

A simple metaphor

I’ve been reading a lot about privilege, gender, and colonization. I will not even try to pretend to be an expert in this area. But I was thinking about how I am often unaware of my own life and its privilege and the role of luck and chance in all of our lives. The following metaphor / parable is what I came up with. It’s a bit of a clumsy analogy, but I thought it worked on a simple level for me.

We are like rivers

A river flows in the direction that it flows because of many things. Although some rivers are fast, or slow, or deep, or wide, they are all made of the same water. And really, a river is nothing more than water flowing along a course that was created by the water that came before it: the water that created the channel, the water that created the canyon, even the water that is downstream, pulling the river along its course.

The river doesn’t know this. It cannot know the struggles of the earlier river-water that moved the rocks. It cannot know the ease with which the earlier river-water flowed down an unobstructed path. It cannot know whether the earlier river-water was obstructed and dammed, or whether a melting glacier helped the earlier river-water to speed its course and deepen its channel. It cannot know that all rivers eventually stop flowing and that all river-water becomes part of the same sea.

All the river can know is that it is flowing now: flowing quickly or flowing slowly, constrained or unconstrained, oblivious to its own history even as its present course and identity are shaped by that history.

We are like rivers in this way. We flow along in our lives, making progress, confronting obstacles, and not always knowing the full context of our own life course.

We should try to understand

But we can try to know more than the river knows. Even as we try to live in the present, we can try to understand how the past shaped the channels and canyons of our life course. We can see how our current circumstances might make things easier or more difficult depending on the obstacles that previous generations faced. We are the beneficiaries of the sometimes arbitrary circumstances that favoured or did not favour those who came before us. We may also carry the burden of the circumstances imposed on those who came before us. Those of us whose lives flow through clear-cut channels may not always realize that we’re travelling a path with fewer obstacles, because those obstacles were removed long before us. We receive these benefits, earned or unearned, aware or unaware. But people whose paths are or were constrained or obstructed are often all too aware of the impedance. And like a river that was once blocked or dammed, the effects of the obstruction can be seen and felt long after the impedance was removed.

We are all the same river-water, flowing to the same sea, but we don’t all take the same course. We would do well to be aware of our privilege and to understand that we may not all have the same course to travel…but we still have to travel to the same place.

Be mindful of your own trajectory. Be mindful of others.

And help when you can.