Tuesday, April 2, 2019

2019 Summer Innovation Program @ MassDiGI


The 2019 MassDiGI Summer Innovation Program (SIP) is a twelve-week internship program that includes the mentoring, guidance, housing, stipends and game development tools to help student teams finalize an original digital game and prepare it for launch.
  • A chance to work on a game prototype from the ground up
  • Work with a team of students to help create the next big hit
  • Industry mentors to help guide you
  • A free place to live and a stipend
This is not just another internship but a chance to hold the fate of a game in your hands.  There are industry mentors and faculty to help you – but in the end it is up to your team to build a successful game.
  • Update: The 2019 SIP application period is now open. Click here to apply. 
  • The application period will close March 20, 2019 at 11:59 pm ET. Successful applicants will be informed on or before March 31, 2019 at 11:59 pm ET.
  • SIP19 runs from May 14 to August 11, 2019. Please read the FAQs before applying.

Best online learning platforms of 2019

Monday, April 1, 2019

Tech Book Face Off: Introduction To Algorithms Vs. Linear Algebra And Differential Equations

This Tech Book Face Off has been a significant undertaking, more so than I originally imagined. It took over ten months to get through these two books, slowly chipping away at them in my free time during the evenings until I had finally read all of the material and worked all (well, nearly all) of the problems. As a result, it's been nearly a year since my last book review, and working through these books was quite the experience. The goal at the outset of this project was to revisit a couple of textbooks from college, things that I felt I should brush up on. I wanted to see how much of these subjects I had retained and how much I had forgotten that might still be useful. I chose Linear Algebra and Differential Equations by Charles G. Cullen because I remember breezing through this subject in college; it was easy at the time, and I coasted through the course without, I thought, giving it the effort it deserved. Then I picked up Introduction to Algorithms by the famed group of professors known throughout the programming world as CLRS. I love algorithms, and I have fond memories of working through this book in my college algorithms course, so I wanted to go through it again in its entirety, picking up the subjects that we had skipped in the course as well. What follows is the perspective of one programmer rereading a couple of old textbooks fifteen or so years later.

[Book covers: Linear Algebra and Differential Equations vs. Introduction to Algorithms]

Linear Algebra and Differential Equations


Linear algebra has definite applications in programming, especially when it comes to machine learning, so in addition to wanting to give this textbook its due, I wanted to brush up on the subject and hone some skills in preparation for more learning. For a math book, it's relatively slim at just under 400 pages, so I thought I could cover it in a couple of months at most. Math books can be deceiving, though. Even working only the odd-numbered problems, the ones with answers in the back of the book that I could check, it took me a full four months to get through the whole thing. It ended up being a much more challenging, but in the end more satisfying, endeavor than I originally expected.

The book is split into nine chapters, with the middle chapters, 2-6, carrying the bulk of the content. The first chapter is an introduction to first-order ordinary differential equations to get the reader started thinking in terms of differential equations and to cover some of the basic definitions. Then we switch gears in chapter 2 and learn the basic notation and terminology for matrices and how systems of linear equations can be represented and solved with them. This material forms a basis for the rest of the book, and includes properties of matrices, determinants, and row and column operations. It's fascinating how matrices can be used to systematically solve systems of linear equations (or discover that there is no solution) in a way that translates to a straightforward algorithm that can be programmed into a computer to solve an arbitrary system automatically.
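To make that last point concrete, here's a minimal sketch of that kind of systematic procedure, Gaussian elimination with partial pivoting, written in Python. It's my own illustration of the idea, not anything from the book:

# Minimal Gaussian elimination with partial pivoting, the kind of
# systematic procedure chapter 2 describes for solving Ax = b.
# Illustrative sketch only, not code from the book.

def solve_linear_system(A, b):
    """Solve Ax = b for x. A is a list of n rows of n floats, b a list of n floats."""
    n = len(A)
    # Build the augmented matrix [A | b] so row operations touch both sides.
    M = [row[:] + [b[i]] for i, row in enumerate(A)]

    for col in range(n):
        # Partial pivoting: swap in the row with the largest entry in this column.
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        if abs(M[pivot][col]) < 1e-12:
            raise ValueError("Matrix is singular (no unique solution)")
        M[col], M[pivot] = M[pivot], M[col]

        # Eliminate this column from all rows below the pivot row.
        for r in range(col + 1, n):
            factor = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= factor * M[col][c]

    # Back-substitution on the now upper-triangular system.
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

# Example: x + 2y = 5, 3x + 4y = 11  ->  x = 1, y = 2
print(solve_linear_system([[1.0, 2.0], [3.0, 4.0]], [5.0, 11.0]))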

In chapter 3 we learn about vector spaces, coordinate transformations, orthogonality, and a number of other things that would be useful in computer graphics. Chapter 4 covers eigenvalues and eigenvectors, which are used to decouple a system of linear differential equations so that the system can be solved more easily. Chapter 5 then continues exploring eigenvalues in systems with higher-order linear differential equations, and things start getting more complex. Chapter 6 builds on these ideas still further with the fundamental matrix of solutions for a system and the matrix exponential. The last few chapters cover some special topics: stability and phase plane analysis in chapter 7, the Laplace transform in chapter 8, and solutions in series form in chapter 9.
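As a rough illustration of the decoupling idea from chapter 4, here's a small sketch of my own (assuming NumPy is available, with a made-up example system): diagonalizing A splits x' = Ax into independent scalar equations that can each be solved with a plain exponential.

# Sketch of the chapter-4 idea: decouple x' = Ax with an eigendecomposition.
# If A = V D V^{-1} with D diagonal, then in y = V^{-1} x coordinates the
# system splits into independent scalar equations y_i' = lambda_i * y_i,
# each solved by y_i(t) = y_i(0) * exp(lambda_i * t).
import numpy as np

A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])          # made-up example system x' = A x
x0 = np.array([1.0, 0.0])             # initial condition x(0)

eigenvalues, V = np.linalg.eig(A)     # columns of V are eigenvectors
y0 = np.linalg.solve(V, x0)           # transformed initial condition: y(0) = V^{-1} x(0)

def x_at(t):
    """Solution x(t) reassembled from the decoupled scalar solutions."""
    y_t = y0 * np.exp(eigenvalues * t)  # each mode evolves independently
    return (V @ y_t).real               # map back to the original coordinates

print(x_at(1.0))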

The pacing of the material and how it was arranged were well done, and I never felt overwhelmed or at a loss. Explanations were concise but sufficient. Discussions were clear, and when it would be helpful, Cullen defined step-by-step processes for working through solutions with a particular method. However, the experience was not without its frustrations. Real-world applications were sparse, and it was not always obvious why you would want to use a certain concept or method to solve an actual problem, or how you would go about doing it in practice. It would have been helpful to ground concepts in real applications more thoroughly, especially later on with things like the matrix exponential. I'm still not quite sure why I would want to use that.
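The standard textbook answer, at least, is that e^(At) packages the whole solution of x' = Ax as x(t) = e^(At)x(0), even when A isn't diagonalizable. Here's a quick sketch of that role (my own, assuming NumPy and SciPy are available, with the same made-up system as the eigenvalue sketch above):

# One concrete role for the matrix exponential: it gives the solution of
# x' = Ax in one shot as x(t) = e^(At) x(0), with no diagonalization needed.
import numpy as np
from scipy.linalg import expm

A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])   # same made-up system as above
x0 = np.array([1.0, 0.0])

t = 1.0
x_t = expm(A * t) @ x0         # matches the decoupled eigenvalue solution
print(x_t)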

Even worse than the linear-algebra-for-its-own-sake parts were the constant mistakes throughout the text. These mistakes were not simply typos, either. Definitions would be wrong or would omit important things, making understanding more difficult. Proofs would have mistakes in the derivation that would leave me confused about the steps being taken, until I figured out that there was an incorrect variable, an inverted sign, or a missing term that, when corrected, made a lot more sense. There were even entire duplicate problems in different problem sections, and some of the solutions were obviously wrong, too. The worst type of mistake was when subscripts were swapped in the matrix notation. Reading matrix element subscripts and understanding how they relate to positions in the matrix is mind-bending enough as it is, but when you also have to question whether they're correct, it can really throw you off track for a while.

While the material and organization were decent, I can't recommend this book to anyone learning linear algebra, simply because of the pervasiveness of the mistakes. There must be a better option when it comes to linear algebra textbooks, and I'm frankly surprised that my professor chose this one for the class. I'm almost tempted to look for a better book, but I lack the motivation to do so. I've already gone through the material, so another book wouldn't add much to my understanding of linear algebra. This textbook served its purpose for me, even if it was a little more frustrating than it could have been.

Introduction to Algorithms (CLRS)


Data structures and algorithms hold a special place in my programmer's heart. I've always loved learning about and working out more efficient ways to solve programming problems, and as I said, I have fond memories of the algorithms course I took in college that used this textbook: CS 577. Lately I've felt like my knowledge of algorithms has been slipping, so I wanted to refresh my memory by working through this book again. Additionally, we didn't cover the entire book in that semester course, so I wanted to read through the stuff we had skipped the first go around.

I was surprised by this second read of CLRS in a number of ways. First, I couldn't believe the enormous amount of work it took to get through. I knew it was a big book. I could see that, plain as day. It's over a thousand pages plus appendices, for heaven's sake. However, I had forgotten that each chapter was split up into sections that were each 5-10 pages, followed by a set of problems that could take an hour or more (sometimes much more) to solve. I'm sure we didn't do all the problems for any given section during the course, but I tried my best to do them all this time around. It was tough, to say the least.

The second surprise was that the book was, for me, split up into four basic parts: tedious, interesting, frustrating, and challenging. The tedious part was the first dozen chapters of stuff that I remembered pretty well and of which I still had a fairly good grasp. These topics were algorithmic complexity, sorting, searching, elementary data structures, hash tables, and binary search trees. Next came the interesting stuff that I had a vague recollection of, but really enjoyed relearning. These concepts included red-black trees, dynamic programming, greedy algorithms, binomial heaps, and all of the graph algorithms. I was reading this stuff around the same time I was writing the series on simple game algorithms, so the graph algorithms were especially relevant to that project.
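As a small taste of why the graph algorithms meshed so well with game problems, here's a tiny breadth-first search sketch of my own (the graph is made up, and this isn't code from the book or the series) that finds a shortest path the way you would on an unweighted tile map:

# Minimal breadth-first search, one of the CLRS graph algorithms that maps
# directly onto game problems like pathfinding on an unweighted grid.
from collections import deque

def bfs_shortest_path(graph, start, goal):
    """Return a shortest path from start to goal in an unweighted graph, or None."""
    parents = {start: None}
    queue = deque([start])
    while queue:
        node = queue.popleft()
        if node == goal:
            # Walk the parent pointers back to reconstruct the path.
            path = []
            while node is not None:
                path.append(node)
                node = parents[node]
            return path[::-1]
        for neighbor in graph.get(node, []):
            if neighbor not in parents:
                parents[neighbor] = node
                queue.append(neighbor)
    return None

graph = {'A': ['B', 'C'], 'B': ['D'], 'C': ['D'], 'D': ['E']}
print(bfs_shortest_path(graph, 'A', 'E'))  # ['A', 'B', 'D', 'E']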

After the good stuff came the "selected topics" part of the text, which I thought I would enjoy, but which turned out to be maddeningly frustrating because of the cursory coverage of these topics. Most of them didn't seem to add much value to the book at all. The chapter on sorting networks was a complete waste, and it was replaced with multithreaded algorithms in the third edition of the book. Matrix operations, linear programming, number-theoretic algorithms, string matching, and computational geometry were all poorly covered because each of them really requires its own book to do it justice. Most of the time in each chapter was spent inadequately introducing entirely new notation just to be able to talk about the topic, so it took a lot of effort to grok all of the notation while the treatment still felt rushed and incomplete. It would have been better to pick one of those topics, give it four or five chapters like the other parts of the book, and leave the rest out.

The last couple of chapters, on NP-completeness and approximation algorithms for NP-complete problems, were a good end to the book. They were challenging, thought-provoking, and nicely tied together a lot of the material from the rest of the book. Like the interesting middle chapters, I found these chapters a worthwhile read. What I should have done, and would do in a future read, is read chapters 13-26, 34, and 35 while skipping the rest. Even with that set, chapter 26 on maximum flow was quite difficult, mostly because it was not explained nearly as well as other topics. The occasionally vague explanations were another issue I had with the book. Most topics had decent explanations with good examples to help with understanding the material being presented, but every so often, as with flow networks, the explanations and examples were seriously lacking. In those cases I found it much more difficult to understand what was going on and how the steps of an algorithm actually worked in practice to find a maximum flow in a network. It was all the more frustrating because the rest of the text was quite clear and well written.
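For the curious, here's roughly the kind of procedure that chapter builds up to: a compact Edmonds-Karp sketch (breadth-first Ford-Fulkerson). This is my own illustrative reconstruction in Python, not the book's pseudocode, and the little network at the end is made up:

# Compact Edmonds-Karp: repeatedly find a shortest augmenting path with BFS
# in the residual graph, push the bottleneck amount of flow along it, and
# stop when no augmenting path remains.
from collections import deque, defaultdict

def max_flow(capacity, source, sink):
    """Return the value of a maximum flow from source to sink."""
    # Build an adjacency map that also contains zero-capacity reverse edges,
    # so the BFS can walk backward (residual) edges as well.
    graph = defaultdict(dict)
    for u, edges in capacity.items():
        for v, c in edges.items():
            graph[u][v] = graph[u].get(v, 0) + c
            graph[v].setdefault(u, 0)

    flow = defaultdict(lambda: defaultdict(int))
    total = 0
    while True:
        # BFS for an augmenting path in the residual graph.
        parents = {source: None}
        queue = deque([source])
        while queue and sink not in parents:
            u = queue.popleft()
            for v, cap_uv in graph[u].items():
                if cap_uv - flow[u][v] > 0 and v not in parents:
                    parents[v] = u
                    queue.append(v)
        if sink not in parents:
            return total  # no augmenting path left, so the flow is maximum

        # Find the bottleneck residual capacity along the path...
        bottleneck = float('inf')
        v = sink
        while parents[v] is not None:
            u = parents[v]
            bottleneck = min(bottleneck, graph[u][v] - flow[u][v])
            v = u

        # ...and push that much flow along it, cancelling it on reverse edges.
        v = sink
        while parents[v] is not None:
            u = parents[v]
            flow[u][v] += bottleneck
            flow[v][u] -= bottleneck
            v = u
        total += bottleneck

# Small made-up network: the maximum flow from 's' to 't' here is 3.
capacity = {'s': {'a': 2, 'b': 2}, 'a': {'t': 2, 'b': 1}, 'b': {'t': 1}}
print(max_flow(capacity, 's', 't'))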

The third and final surprise came from the problems. The difficulty and applicability of the problems were extremely inconsistent. Within the same set of five to ten problems, there would be some that could be solved in minutes while others would take an hour or more to figure out. Sometimes I wouldn't even know where to begin with a problem, and I'd have to set it aside for a day or so until a flash of inspiration came to me and I could go back and try to solve it. Other times I wouldn't even know what the problem was really asking, so I'd have to make some assumptions and solve it based on those.

From a certain perspective it might be considered a good thing to struggle with the problems and work through that struggle. Not every problem in the real world is going to be easy, so it's best to practice and develop that ability to persevere, but the problems I struggled with the most seemed to have the least relevance to the material just presented. When I'm studying a topic, I expect the problems following a section to be about that topic, helping the reader develop the skills to solve related problems or a deeper understanding of the material. However, some of the most frustrating problems had nothing to do with the sections they were in. For example, the last problem in section 34.2 on polynomial-time verification has the reader prove that a graph with a certain type of construction is a Hamiltonian graph, yet this problem has nothing to do with polynomial-time verification or NP-completeness. It's purely a graph theory proof. If anything, it should have been in one of the graph algorithm chapters, but I'm not sure it needed to be in the book at all.
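The actual point of that section, that a proposed Hamiltonian cycle (a certificate) can be checked quickly even if finding one is hard, is easy to show. Here's a minimal verifier of my own in Python, with a made-up graph, just to illustrate the polynomial-time verification idea:

# The core idea of section 34.2 in miniature: even if finding a Hamiltonian
# cycle is hard, verifying a proposed cycle takes only polynomial time.

def verifies_hamiltonian_cycle(graph, cycle):
    """Check that `cycle` visits every vertex of `graph` exactly once and
    returns to its start, using only edges that exist in the graph."""
    vertices = set(graph)
    if len(cycle) != len(vertices) or set(cycle) != vertices:
        return False  # must list each vertex exactly once
    # Every consecutive pair (wrapping around to the start) must be an edge.
    return all(cycle[(i + 1) % len(cycle)] in graph[cycle[i]]
               for i in range(len(cycle)))

# Made-up example: a simple 4-cycle.
graph = {'A': {'B', 'D'}, 'B': {'A', 'C'}, 'C': {'B', 'D'}, 'D': {'C', 'A'}}
print(verifies_hamiltonian_cycle(graph, ['A', 'B', 'C', 'D']))  # True
print(verifies_hamiltonian_cycle(graph, ['A', 'C', 'B', 'D']))  # False: A-C is not an edge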

Between the inconsistent difficulty of the problems and the propensity for problem sets full of proofs of tangential properties, loosely related to the topic of the section, rather than application problems to practice on, working through the problems was a little too frustrating much of the time. Of these two books, the linear algebra book did a much better job of collecting appropriate problems for the purpose of practice and understanding. It would have been great to have more problems that had the reader work more examples of the algorithms presented, or that had them develop new or improved algorithms for extensions to the problems covered in the sections. There were some of these types of problems scattered throughout the book, but they were overshadowed by the prove-this-semi-related-property type of problem.

This critique is probably a bit unfair because CLRS is meant to be a book for aspiring computer scientists, not necessarily computer programmers, so it's not surprising that I didn't find it as useful as it could have been. Something more focused on applications of programming algorithms would have suited my desires better. Robert Sedgewick's Algorithms or Steven Skiena's The Algorithm Design Manual may have been much better choices for what I wanted to read, but I already had CLRS and, you know, fond memories.


It seems I've struck out with these two books and spent a lot of time doing it, but not all was lost. I did learn and relearn a lot. I gave my brain an admirable workout with challenging material for ten months or so, and now I'm ready to tackle more challenging material in the realm of machine learning. I'll still be on the lookout for more engaging books on linear algebra and algorithms in the future, too. Linear Algebra and Differential Equations had way too many mistakes and could have used some additional explanations and applications. Introduction to Algorithms could also have used more applications, and I would have preferred a book focused more on practice than theory this time around. I know those books are out there. It's simply a matter of finding the right book at the right time that happens to meet you where you are. Live and learn.