Sunday, June 22, 2014

Module 3: Week 5

It's apparent that many organizations across all types of industries have been doing their homework with respect to the internet and web usage trends of consumers. In retrospect, it's actually pretty impressive that we have the tools and techniques available to analyze and understand the way our targeted consumers use web resources. After all, web access may not even be completely ubiquitous in our own country. Where I work and live (in SW Montana), there is still a decent portion of the population that could be considered non-Internet-using.

According to a Pew Research Center survey (February 27, 2014), 87% of Americans are considered Internet users. Combine this with the fact that the publicly accessible Internet is only 25 years old, and our ability to perform useful analysis on usage is actually pretty amazing. This is especially true when you consider that the 90s and early 2000s were really just an introduction to the Internet "with training wheels" for many Americans via AOL and CompuServe. So looking at internet usage as a primary means for conducting marketing, communication, sales, collaboration and social interaction (although AOL did provide this last one), the industry may really only be 10 to 15 years old.

The point is really that compared to other industries, it appears that the methods for analyzing web-interaction and usage have advanced as quickly as web-technology itself. This is amazing.

After digging into this week's materials and assignment, however, it became clear that just because we have tools available, not everyone is ready to realize the power that can be had by using them properly. For my assignment, I chose to use the website of my employer, the University of Montana. I requested access to our Google Analytics account, and quickly realized how ineffective we are at arming ourselves with information harvested from such a tool.

Interestingly, the Google Analytics account is "owned" by my organization (Central I.T.), but there are users created for the Assistant VP for Marketing, Admissions and others. After reading the lectures and the supporting materials on web analytics, it was pretty clear almost right away that nobody at UM understands the data or tools well enough to realize their potential or value.

There were two things that bothered me the most about how UM isn't properly leveraging the power of Google Analytics. First, there are zero goals created. Thus, looking at how well the site is helping achieve them is futile. Second, there is no data on search. UM uses Google Custom Search as the search engine for the site, but no effort has been put into tracking how it is used. I'm not a marketing expert myself, but to me it sure seems that knowing what people search for would be useful.
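Even without Google Analytics' built-in Site Search reports, a first pass at "what do people search for" doesn't take much. Here is a minimal sketch in Python, assuming a hypothetical export of search queries (the query strings below are made up for illustration):

```python
from collections import Counter

# Hypothetical sample of site-search queries; real data would come from
# Google Analytics' Site Search reports or a server-side query log.
queries = [
    "Admissions Deadline", "housing", "admissions deadline",
    "financial aid", "Housing", "admissions deadline ",
]

# Normalize the queries (trim whitespace, lowercase) and count them
# to surface the terms visitors look for most often.
top_terms = Counter(q.strip().lower() for q in queries).most_common(3)
print(top_terms)
```

Nothing sophisticated, but even a tally like this would tell the Admissions and Marketing folks more about visitor intent than we currently capture.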

The takeaway is that I was really hoping to gain some insight into how a university that has Google Analytics in its portfolio would be using it to acquire meaningful information about its targeted demographics. Sure, there is access to all of the traditional web analytics like unique visits, bounce rates, etc., but I was hoping to see goals, search terms-- the things that make up the "why." For a school struggling to attract students, it seems that the "why" may be the most important measurement of all.


Sunday, June 15, 2014

Module 2: Week 4

Week 4 has definitely opened my eyes to the necessity of the BI discipline and even the Data Scientist career path. After spending the first few weeks on data warehousing, it didn't really feel all that different from an extra layer on top of a strong knowledge of SQL and relational databases. Sure, the star-schema thing is a little different, and the ability to break the normalization rules is no skin off my back (and I imagine the backs of others too).

Even after reading Lecture 8 on dashboarding, it seemed as though it would be easy to jump right into a BI tool such as Microstrategy. In fact, I was taken aback by the need to take 23 quizzes on using the software. The tutorials and quizzes were an easy introduction to the tool, but it wasn't until beginning the second assignment that I began to realize how important and "heavy" business intelligence really is.

Nonetheless, I dove in, fumbling through the Microstrategy interface. It seemed pretty easy at first, like any other visual reporting tool (Crystal Reports comes to mind). However, as the realization hit that I was being tasked with scouring a significant amount of data, identifying key attributes and metrics, and merging it all into something that my boss could read and even understand-- that's when it became apparent that this is no easy task.

The first question was really not that challenging, with the exception of building familiarity with the tool. Working on the second question was where it became more challenging, and more indicative that either I wasn't getting Microstrategy all that well, or maybe Microstrategy is an OK tool, just not a great one.

I was able to generate custom reports that had some meaning fairly easily, and I was able to create what Microstrategy deems a dashboard/scorecard easily as well. However, after reading Tyler's comments I questioned what I had done. After watching Hans Rosling's TED talk, it was clear that dashboarding is as much about communication in general as it is about data analysis.

After working pretty intensely on Part 2 of the assignment, I really began to form the opinion that Microstrategy may not be the best tool available for serious BI work. It's possible that more time behind the wheel would make it seem more powerful to me, or that the commercial version is more comprehensive. I formed this opinion because I found the need to export data into Excel to do any analysis beyond reporting. This didn't seem so bad-- after all, there was a tutorial on precisely this feature.

I imagine that someone who has been doing this for a while and that has experience with Microstrategy may be able to do some of the more hard-core analysis in the tool. I am not one of these folks, thus I used the Excel standby.

The weaknesses that I refer to are those that would help someone identify trends across data using the tool. I applied filters and ranges, etc., which was great, but conceptually that is really only beefing up the "WHERE" and "HAVING" criteria. What I was really hoping to see was a tool that could, on its own, point out that there may be a trend in revenue among 41-to-60-year-old females. Or that the state of Washington appeared to not be on the radar much at all in 2007, while ramping up revenue over 2008 and 2009.
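To be fair, the "point it out on its own" behavior I wanted isn't magic; it's the kind of thing a short script can approximate once the data is exported. Here is a sketch of the idea, flagging segments whose year-over-year revenue growth exceeds a threshold (the segment names, figures, and the 50% threshold are all my own made-up assumptions, not the assignment's data):

```python
# Hypothetical revenue by (segment, year); in practice this would be
# exported from the BI tool or pulled from the warehouse.
revenue = {
    ("WA", 2007): 10_000, ("WA", 2008): 60_000, ("WA", 2009): 140_000,
    ("MT", 2007): 50_000, ("MT", 2008): 52_000, ("MT", 2009): 55_000,
}

# Compute year-over-year growth for each segment and flag any jump
# bigger than 50% -- a crude, automatic "this looks interesting" signal.
flags = []
for (segment, year), amount in sorted(revenue.items()):
    prev = revenue.get((segment, year - 1))
    if prev and amount / prev - 1 > 0.5:
        flags.append((segment, year, round(amount / prev - 1, 2)))
print(flags)
```

With numbers like these, Washington's 2008 and 2009 ramp-up gets flagged automatically while Montana's flat revenue stays quiet-- exactly the kind of nudge I was doing by eye in Excel.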

I was able to notice a couple of these things by myself using Excel. However, I did not find this type of thing to be an obvious feature of Microstrategy.

Beyond the tool, I imagine that Cognos or Tableau may have some of these power features built in. I could be wrong, but I hope not. Anyway, regardless of the tools, it is clear why we currently have such a buzz around BI and data scientists. This is real work using sophisticated tools, and it requires experience and knowledge of how to present meaningful statistics derived from large amounts of real data. In addition, one would also need a decent amount of domain knowledge about the organization where the data is being used.

I work in a university setting where it is all about enrollment, potential enrollment, retention, graduation rates, etc. It doesn't sound all that difficult, but across multiple campuses in a system, it does require some knowledge of the paradigm, the nomenclature/terminology and the data itself. Obviously it differs quite a bit from revenue and the like, but the need to forecast is equally important-- and perhaps equally hard to do accurately.

In closing, the last couple of weeks were a useful introduction to the rabbit hole of business intelligence. I hope that I grasped some of the concepts well enough to use them effectively in my career, and that I didn't totally miss the boat on trending and forecasting in Microstrategy. Hans Rosling made me feel as though I missed something completely.

Sunday, June 1, 2014

Week One


Big-Data, Web 2.0, Business-Intelligence, etc. More industry buzz-terms. But do they really mean anything? After all, Web 2.0 really was nothing more than a buzz-term describing a collection of practices with no actual specification. Thus, when the terms "B.I." and "Big-Data" were introduced several years ago, I had the jaded idea that they were nothing more than buzz-terms a la Web 2.0.

However, once introduced to the term "datafication," it really becomes apparent why understanding Big-Data and the ideas it encompasses is so important. Sitting back and pondering datafication is also what makes Big-Data so interesting. Through society's adoption of computer- and internet-based technology, data is becoming essentially infinite. The idea that we now have enough data to virtually skip modeling and extrapolation is exciting and awe-inspiring. Add to this the term "velocity." In physics, velocity describes both speed and direction. When applied to Big-Data, the term fits, as the realm of Big-Data is fast-moving and multi-dimensional.

Business-Intelligence (BI) has a more obvious meaning, and it has been a relevant term in the industry for at least ten years. However, now that we have moved into the time of Big-Data, BI takes on a new level of importance and possibility. Because of the datafication of our current world, developing useful BI is more challenging than ever, yet when done well it also has the potential to be more useful and impactful than ever.

In application, what I liked about the materials presented on BI was the emphasis on predictive modeling-- it seems to be the function that many organizations overlook. Much effort is placed on building and implementing dashboarding and reporting, but it appears that successful organizations understand first what they need to report on and how to use data to predict their next move within the marketplace. This idea was reinforced by the explanation of the "Data Scientist." It's a great description of what might actually be missing in many organizations. A scientist is someone who knows how to identify a problem, identify a means for taking measurements while controlling the variables within the problem to produce data, and most importantly, someone who knows how to analyze and use this data to form reasonable and meaningful predictions and outcomes. Thus, the "Data Scientist" may be exactly what many organizations are looking for-- whether or not they know it yet remains to be seen.
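Even the simplest predictive modeling-- fitting a trend line and extrapolating-- goes a step beyond reporting. As a toy illustration in my own domain (the enrollment numbers here are invented, not real UM figures), a hand-rolled least-squares line forecasting next year's enrollment:

```python
# Made-up enrollment counts for four years; a least-squares line through
# them gives a naive one-year-ahead forecast.
years = [2010, 2011, 2012, 2013]
enrollment = [14_000, 13_600, 13_300, 12_900]

n = len(years)
mean_x = sum(years) / n
mean_y = sum(enrollment) / n

# Ordinary least squares: slope = cov(x, y) / var(x).
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, enrollment)) \
    / sum((x - mean_x) ** 2 for x in years)
intercept = mean_y - slope * mean_x

forecast_2014 = slope * 2014 + intercept
print(round(forecast_2014))
```

Reporting would stop at "enrollment fell again this year"; the predictive step is what tells leadership what next fall probably looks like if nothing changes.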

Many of us have worked in an organization with an emphasis on metrics. I was part of a management team trying to develop the collection of metrics to be used to assess the performance of the organization. It's amazing what occurs when ten or twelve people get together in a conference room for this discussion-- they literally want to begin measuring anything and everything. Without understanding the organization's goals, or identifying the problem(s) needing to be solved, measurement can become overkill. Having the skills to identify what is actually important to an organization is the real power.

Thus, the sum of what was covered in Week One was encouraging in the sense that I am happy to see this course and program's emphasis headed in a good direction. It is good to see that we are looking at these two concepts in ways that nobody really did with the coining of the phrase Web 2.0.