A Contributor’s Story with Aanand Kainth

The Contributor’s Story series is designed to give our key open source contributors and community members a face and voice, an overview of the projects they’re working on, and the successes and challenges contributors face in development.

For this blog post, we spoke to Aanand Kainth, a VISSL contributor who is working on designing and implementing an event storage mechanism through the Major League Hacking (MLH) Fellowship.

“I want to be able to apply my skills to new areas and learn about writing decoupled business logic.”

VISSL is a computer VIsion library for state-of-the-art self-supervised learning research with PyTorch. VISSL aims to accelerate the research cycle in self-supervised learning: from designing a new self-supervised task to evaluating the representations learned. Here are some of Aanand’s thoughts on the process of contributing to VISSL for the first time.

Tell us a little about yourself and your current experience in the MLH Fellowship.

As someone who grew up in Silicon Valley, it has always been my dream to work with software. A few weeks after COVID-19 ruled out the option of a summer internship, I noticed the MLH Fellowship in a Dev.to ad and applied that same day. I spent the summer of 2020 working on VISSL and BentoML through the Fellowship.

Where did you first find out about open source? How did you come to use or contribute to it?

I got to know open source because I grew up with it! My parents introduced me to GIMP instead of Photoshop, and I wrote my essays in LibreOffice. When I started writing my own code in IDEs like IntelliJ and found that I could fix the problems I encountered myself, it wasn’t long before I started doing so across much of my work.

Describe the project you are currently working on.

I worked on Facebook Research’s VISSL, a library of cutting-edge self-supervised learning methods. Our main project this winter and spring was designing and implementing an event storage mechanism that makes it easier to perform post-mortems on models and monitor them with different tools such as Tensorboard and HiPlot, without littering the VISSL codebase with if-else clauses.

How did you initially go about approaching the problem?

We (my partner Grace and I) worked with our great mentor (Priya) to design an architecture in which all logging backends in VISSL would store their data. We then implemented event recorders for Tensorboard, JSON files, and standard output.
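The core idea here is a single storage object that training code writes events to, with pluggable recorders per backend. Below is a minimal sketch of that pattern using hypothetical names (`EventStorage`, `put_scalar`, the recorder classes) — it is not VISSL’s actual API, just an illustration of how backend-specific if-else logic can be avoided:

```python
class EventStorage:
    """Collects scalar events and forwards them to all registered recorders."""

    def __init__(self):
        self._recorders = []

    def register(self, recorder):
        self._recorders.append(recorder)

    def put_scalar(self, name, value, iteration):
        # Training code calls this once; every backend receives the event,
        # so no backend-specific branching is needed at the call site.
        for recorder in self._recorders:
            recorder.record(name, value, iteration)


class StdoutRecorder:
    """Writes events to standard output."""

    def record(self, name, value, iteration):
        print(f"iter {iteration}: {name} = {value}")


class JsonRecorder:
    """Accumulates events as dicts, ready to be dumped as JSON lines."""

    def __init__(self):
        self.rows = []

    def record(self, name, value, iteration):
        self.rows.append({"name": name, "value": value, "iteration": iteration})


storage = EventStorage()
storage.register(StdoutRecorder())
json_recorder = JsonRecorder()
storage.register(json_recorder)

# One call fans out to every backend.
storage.put_scalar("loss", 0.42, iteration=100)
```

Adding a new backend (e.g. a Tensorboard recorder) then only requires writing one new recorder class and registering it, leaving the training loop untouched.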

What obstacles or problems have you faced in your work so far?

The VISSL code base is a very complex animal, and both Grace and I were new to self-supervised learning. It was a challenge to grapple with the responsibilities of various components in order to untangle them and make them clearer.

What is the current state of development?

We have opened a pull request for the issue and hope to close it within the next week.

What have you learned about the project, development or open source so far?

These past few months have been very enlightening, especially in helping me understand where I struggle to navigate unfamiliar code bases, and in finding strategies to keep making progress even when I don’t know exactly how some pieces fit together.

I also got extensive practice using advanced Git tools to rewrite history, resolve three-way conflicts, and more, with lots of guidance from the VISSL maintainers and mentors.

What advice would you give future contributors to the open source project?

The best advice I can give future contributors is READ. This was probably the most effective way to get familiar with how various tools, or even VISSL itself, work. Read code, read articles, listen to mentors, and share what you learn.

We’d like to thank Aanand for their continued contributions to the Facebook open source ecosystem. You can follow Aanand’s work through their website and GitHub.

If you’d like to learn more about Facebook open source, follow us on Twitter, Facebook, and YouTube for relevant updates, and visit the VISSL website and GitHub to learn how to get started.
