
In my experience teaching at UBC and ULaval, the first message a student posts in a forum, or the first assignment delivered, tells me whether that student has the potential to become a good analyst. Does he or she show a structured approach to problem solving? Does he or she go beyond the obvious?

"A client is to me a mere unit - a factor in a problem."
Sherlock Holmes
Evidence I: broad and useless
At the eMetrics Marketing Optimization Summit in France, one of the presentations fell for this common error: reporting page views and visits rather than bringing insight on business outcomes at each stage of the customer lifecycle. Without being too harsh, I would say it's a start... but it is far preferable to have more depth than breadth. Focus on a narrower area you can control and make a difference, rather than report widely on metrics that you don't control and that won't result in any business change.
Evidence II: pulling the wrong lever

Case study: students were asked to analyze the Visits to SaveTheChildren.org in January 2010, just when the Haiti earthquake struck. Most students did not mention that traffic was 14 to 15 times higher than usual. Some did not realize this statistical shock was a direct result of increased awareness of Save The Children following the disaster.
Evidence III: chasing the monkey
Most analysts are feeding on "Top 10 tricks to improve SEO", "Top 5 social media tactics", "Top 3 wonders of the online world"... Good for the "now" part of learning, and maybe a necessary step, but in the long run it's not how you will transform the business. This is what I covered in a previous post entitled "Undermining our future as web analysts".
Elementary my dear Watson
