Why I Decided to Take the Digital Marketing Nanodegree Program

If I stepped into my “way-back” machine, I could list the goofy asymmetric hairstyles, the fashion faux pas and the analog habits that by today’s standards would be laughable, unrecognizable to most.  But there was a time when computers were as primitive as rocks in a pile and “digital” was nothing more than the itch of a few visionaries like Steve Jobs, pushing society into the next century the way Henry Ford brought us from the horse and buggy to the Model T.

During that period I was a kid whose eyes widened with dreams of machine automation, wise-cracking voice assistants and flying, autonomous cars.  I used to code simple games and trade code hints with my friends, in my underwear, in all-night hackathons smelling of pizza and root beer.  But as much as I wanted to be the next Steve Jobs, I caved to the pressures of the “real world” after college and sought more practical ways to make ends meet and sustain a livable lifestyle.  I never strayed too far from my digital roots, sticking to start-up companies that were part of the technology revolution.  However, as content creation and distribution changed with companies like YouTube, Google, Facebook, and Netflix, I watched my skillset as a marketer age, and I constantly had to reinvent myself. It was no longer good enough to have a good idea; one had to execute and publish quickly on new social platforms with analytical skills and a data-first, A/B-testing mindset.

I decided to return to school and get an MBA in Strategy, Operations and Technology.  I took courses in product management and brand marketing to delve deeper into how brands create, launch and distribute products and market to customers. This was a very good move for me in terms of general business skill upkeep, and I joined the next generation of business managers riding the digital economy.  However, when it comes to the specific tactics of being an effective digital marketer, gaps remain, because the practices and audiences of the digital age are fluid and constantly changing.

The Digital Marketing Nanodegree Program is a great bridge to ongoing digital learning, giving me access to a personalized curriculum, case studies, projects and leading experts in the game.  I cannot lie: sometimes I feel old on these forums, outdated next to digital natives who grew up with a cell phone instead of a pacifier.  I already have a time-consuming job, and there is never enough time to learn it all.

But you don’t have to be Steve Jobs to succeed at this program.  You just have to dream and commit to getting better. What’s stopping you from signing up now? You got dis!

The Commercial Future: Artificial Intelligence

In this blog post, I interview Nikolai Pereira, an entrepreneur and founding partner of Leova, an AI system that voice-enables mobile apps.  He is an algorithm junkie and an expert in computer science and electronics engineering with direct knowledge of the future of Big Data and Artificial Intelligence. Let’s begin:

SR: Many of us have a notion of Artificial Intelligence as a giant leap into Science Fiction, a cold, ominous voice echoing in the chambers of a satellite station, omnipresent, omniscient.  What is it exactly? Are we there yet?

NP: I’ve gotta admit, the movies are pretty frightening. The thought of a HAL-like entity watching me coldly, from my ‘smart washing machine’ haunted my younger years. But reality is pretty far removed. This fear originates from our tendency to project our human selves onto machines. As humans, it is our genetic programming that orients and drives us.

I’d like to offer an example: Ralph and Macy are co-workers at a company where dating your co-worker is prohibited. They may still choose to go out with each other. That’s because office directives are ‘rules’, whereas their genetic directive to procreate is part of the guiding programming wired into them to ensure survival of the species. Unfortunately, that drive has also led us to unmanageable population levels in various parts of the world, uncontrolled exploitation of our planet’s resources and climate change.

Machines, however, are driven differently. Just like your iPhone, every program, every algorithm is given a purpose by its designer. I’ve discussed this in great detail in a post on my blog: http://17e.in/mD . The power steering in your car is driven by a singular purpose – to make your steering experience more convenient, within the constraints of safety, et al. At no point in time can it change its ‘mind’ about what it does and become a music system (if that example sounds absurd, it is – but it’s also extremely analogous to the possibility of a machine behaving any way other than the way it was programmed).

So what we’re projecting onto machines is the fear that they might behave like us, and then compete with us for natural resources. If you’re worried about machines spontaneously deciding to do this, then have no fear. If you’re worried about a human being programming them to procreate like us and compete for our resources, well, then you’re onto something. If you’re worried about some hacker breaking into the control system of your car and disabling the brakes, you’re not paranoid. These are real – but they’re not caused by the AI going nuts. But then again, computer viruses have behaved in that exact same way for the past 20 years or so; and we worry less about them every day than we ever have before. Remember, as a species, we’re supremely qualified to survive almost everything that might come our way.

SR: How different is it from Big Data, a concept that has floated around for a long time?

NP: Big Data is something else entirely, related to AI in only a supplementary way. Big Data is the ability to process large amounts of data and then derive some insight from them. Companies have been collecting data about their customers for a long, long time; it’s only recently that the tools to process and sort this data have become affordable and accessible. While Big Data offers amazing possibilities, it also offers a disappointing insight: maybe we’re not as unique as we’d like to think we are; our preferences resemble those of many other people, and a lot of our actions are predictable from the previous actions of people who could be grouped into our demographic.

For example, when you search for movie X on Netflix and Netflix recommends Y and Z based on what you’ve watched, that’s Big Data processing in action. Those recommendations come from Netflix seeing that other people who’ve watched X also tend to watch Y and Z. And more often than not, Netflix is right – I do want to watch movies Y and Z.

Big Data is being used to empower decision making everywhere. Companies like GoodData and Mu Sigma crunch data to provide their customers with insights from this data. An example of this as a value add to a customer like Target could be to identify buying habits of a particular demographic (such as college students) and then cluster their frequently bought products contiguously in a store so that they don’t forget to buy something. This makes the customer more likely to have higher dollar totals at checkout. Big Data is used to make commercial decisions about many things – and as the ability to crunch big data becomes more commonly available, it will be used by ordinary mortals to make better decisions regarding their daily lives and well-being.

SR: What commercial applications of AI can we see in use today?  The other day I was joking that buying a phone will be analogous to interviewing a personal assistant. Voice, tone and readiness will be as important as accuracy of data.  Right now Siri, Google Now and Cortana lead the pack.  What am I missing?

NP: Actually, from your washing machine, which decides how much water to fill its drum with, to ‘smart’ thermostats in the workplace, to the Google Ads being served to you – AI is ubiquitous. Associating it with an interface (such as a voice, or a blinking red light in the case of HAL 9000) is limiting and takes away from how omnipresent ‘smart decisions made by machines’ are.

As time goes by and computing power becomes cheaper and more accessible to inventors and scientists, you will see more visible manifestations of AI in our world. A fantastic case of AI showing off its computing chops is Google’s self-driving car. This car (actually a fleet of roughly 25 cars) has driven over 1 million autonomous miles across the country without a single accident that wasn’t caused by external human error. I could argue that if all cars were switched to self-driving technology, with zero human interference, our grandchildren might never know a road accident in all their lives.

Right now, the most obvious and in-your-face use of AI would be Google Now, which uses AI, often coupled with Big Data, to try to predict your needs and show you information related to your schedule. Siri isn’t really AI, except in the most rudimentary sense of the term.

SR: It seems technology is seldom consumed in one serving.  Different parts come together over time to solve a problem.  For example, the internet was the marriage of PC, telephone and browser technology advances over a period of 30 years.  What needs to come together for AI to accelerate user adoption?

NP: I’d actually say that we’re all already using AI in one way or another. If you’re not consuming it directly, that’s because AI output, in its current state and form, is error-prone and best ‘cleaned’ before being sent to the end user.

SR: Talk to me about your product, Leova. How does it crack the AI riddle?

NP: Well, Leova is an algorithm that understands naturally spoken language. This is a big step forward, because Leova doesn’t need you to use or memorize keywords, and its ability to manage a conversation is a very big deal. To explain what this means: Leova’s travel implementation allows you to ask for a flight in your first sentence; in the second sentence you can change the date of the flight; then you can change the destination city; then you can tack on a return flight. We spent a lot of time studying how human beings talk to each other before attempting to build a system that can handle interactions of that nature. All our work boils down to hard science and our ability to understand spoken human interaction.

SR: It was great talking with you! Good luck.  I am excited about your new algorithm!

Wearable Free-Conomics

Warren Buffett said he doesn’t invest in technology because he doesn’t understand it.

What if I engaged the mighty sage of Omaha on the future of wearable computing, like smart watches and Google Glass?

Fa-get about it!

But what company does not have some form of technology enabling it or disrupting its industry?  3D printing is reducing the hassle of logistics and inventory management in manufacturing.  Drones are swooping down in dense metros, replacing bike couriers for mail delivery.  Self-checkout kiosks are displacing retail clerks to consummate a sale. And it’s only getting worse.

The Buffett idea that a firm can permanently stay at arm’s length from its competition because of a durable competitive advantage is a thing of the past, a bone of the analog age.  Any company that is not embracing digitization and developing “spider thinking” is living in that past.  The wave of internet competitors angling for domination across the retail, telecommunications, CPG and entertainment sectors is far too numerous and aggressive (more so than in the dotcom era) for anyone to stumble over stones of denial or hesitate.  He who hesitates to understand digital and exploit its leverage will lose.

In the same way many sports analysts argue that basketball has transformed from a game of pure skill and intelligence (Larry Bird, Magic Johnson) to one of speed, size and athleticism (LeBron James), business has changed.  Robust, proprietary algorithms with a slick, agile interface that delights customers while attacking pressure points in the value chain are table stakes.  The crossover dribble AI (Allen Iverson) popularized in sneaker ads and on NBA courts is not fancy “street ball” anymore; it’s fundamental basketball.  Similarly, profitability itself does not entitle firms to sustainability.  Rather, the way a company profits – by developing an unfair advantage and leveraging the compounding power of network effects – does.  Digital is the new “crossover” that is fundamental to business.  As a result, digital victors must find the right business model, one that can plug-and-play their products and services with other providers, allowing for flexibility and sufficient scaling as the environment changes.

With that said, one does not need to be a sage like Warren Buffett to see that the future of wearable technology needs a business model upgrade, or mass adoption will not spread.  Wearables 1.0 was about inventing technologies. Wearables 2.0 has to be about architecting rich business models.

For example, while Google Glass enjoyed first-to-market buzz, consumer sales languished because critics claimed it was too expensive at $1,500, aesthetically un-wearable and lacking real-world utility, with too few apps on its unique mobile operating system. Voice commands like “OK Glass” could not prevent the Google Glass Explorer Program from shutting down.  Even Astro Teller, head of Google X Labs, admitted to a Vanity Fair audience in fall 2014 that wearables have to drop in price by a significant factor to get an uptick in demand and go mainstream.  Likewise, Apple launched its first wearable, the Apple Watch, in early spring 2015, tallying up $1 billion in two weeks of pre-orders.  However, Slice Research claims that the halo of initial sales has dimmed and plunged by 90%, dropping from 200K units to 20K units per day.

For business models to work effectively, they have to answer the who, the what and the how.  Who are the paying customers for wearables? What are wearables, and are they a good product with no superior alternatives? How are wearables purchased – through a distribution channel like retail stores, or online?  If the first question is not answerable, it is very hard to stand a business model up successfully.  A Harris survey found that 59% of Americans don’t get the need for wearable technology.  It seems both Apple and Google have answered the what and the how, but the who remains an open question.  What’s worse, according to research from Endeavor Partners, a third of people who buy wearables stop wearing them within six months.

The main challenge for market penetration of wearable tech is that it doesn’t provide enough utility.  The killer app, telling time, was figured out hundreds of years ago, starting with the sundial, and the smartphone already gently alerts us to emails and texts with a buzz in our pocket.

Wearable manufacturers will have to lower prices with cheaper versions before breakthrough sales can ensue.  Today, wearable manufacturers apply a cost-plus model: they source parts at cost and charge a markup.  But that prices wearable tech out of the market for the average consumer.  If Moore’s Law continues to hold, the cost of producing longer-lasting batteries and smaller components will come down.  Until then, there are a few potential approaches to fix this broken business model.

  1. Upfront carrier subsidies. As with smartphones, the price of a smartwatch could be lowered to $100 if the customer signs a two-year contract.  The carrier covers a portion of the device cost and recoups the loss through data service revenue.
  2. Offering wearables for free, with carriers selling data. This will not make manufacturers happy, as it reduces perceived value, but carriers could wholesale the device and recoup the cost by selling the data on the device to marketers.
  3. Offering wearables for free, offset by personalized mobile advertising. As devices get more intimate, consumers may be put off, but it is a viable way of making money; this is how Google derives a large portion of its revenue.
  4. Freemium. Basic wearable data for free; advanced services for a premium charge.

Warren Buffett once said, “There is no such thing as a free lunch.”  However, wearables may require a new way of thinking and investing: Free-conomics.  Online, there is such a thing as a free lunch.  Much of free-conomics is based on the idea of dispensing free content to build audiences and sell them to advertisers.  Freemium is part free basic services and part charging for advanced services.  Because the marginal cost of offering digital services approaches zero, it is OK if 99% of customers hog the service for free as long as 1% are willing to pay for the “premium version.”  In the case of wearables, there may be an opportunity to offer basic access for free but sell a premium subscription to more advanced options.

There is an intimacy to how wearable tech can serve a customer seamlessly, built into clothing off the rack, rather than as a glass slab you swipe dozens of times with the pads of your fingers to get something.  The advantage of computing worn on the body is that it can deliver data that is custom and context-aware, which, once consumers understand it, they will be willing to pay for.  There is a buffet of business models for wearables.  Wearable tech startups must artfully blend manufacturing, pricing, ongoing demand generation and distribution with the role of data, users and metrics.  That will be crucial for customers to justify buying in.


ATAWADATAH

The first time someone said “Atawad” to me, I retorted:

“Don’t wa-da-tah to the shama cow… ’cause thats a cama cama leepa-chaiii, sa-da-tay, dig?”

I thought this person was trying to one-up my knowledge of Pootie Tang slang.  Turns out he was talking digital code for on-demand, the future of business models.

Atawad stands for anytime, anywhere, any device.  It describes how people consume video in a multitude of impatient contexts.

For example, VCR is Atawad. DVR is Atawad.  IPTV is Atawad.  All these platforms allow viewers to time-shift and watch content when they want rather than on a mechanized, broadcast schedule.

But to extend the concept beyond video-on-demand, Atawad suggests a new mantra to sound out loudly in daily meditation, or to magically unlock the cave of venture capital treasure: mobile-driven business models that provide the keys to instant gratification and real-time fulfillment.

In the old analog world, businesses could get away with maximizing profit from passive customers with standardized offerings.  In the new digi-universe, businesses must be purposeful brands offering personalized, on-demand and shared services.

Uber is a great example of a disruptive, on-demand model that solved a real problem: hailing a taxi in a maddening city like New York.  One’s voice drowned out among millions of people.  Variable cold weather.  Implicit bias based on one’s physical look.  Poor service because of poor dispatch infrastructure and poorly trained cab drivers.

Uber saw the car-sharing opportunity at the bottom of the pyramid, where the crowd is the company.  The crowd is a living organism that already connects instantly through social networks.  The value proposition is moving idle goods quickly through the crowd in cost-effective ways versus the slow, sloppy execution of monopoly cab services.  All Uber had to do was algorithmically layer a digital mesh on that existing infrastructure, integrating online and offline and making the business model simple: empower the crowd to bypass inefficient institutions and get what they need from each other.  Aggregate and organize surplus labor that has time but no money to help those who have money but no time.

Zipcar, Airbnb and Rent the Runway are more examples of platforms creating multi-sided markets that let services and customers interact more intelligently. No need to reinvent the wheel: re-use and recycle idle assets while lowering the cost of ownership.  Soon everyone will get a ride, get groceries, try on clothing, even get their new mobile devices through a concierge service, without having to be stuck in a contract.  Even Sprint, a big-four mobile carrier, recently announced Direct 2 You, a premium service for in-home delivery and setup of phones.

Whether a start-up or a legacy enterprise, if your business does not incorporate big data, personalization or co-creation, you will not “Penna tine on da damie kays.”


I Love It When U Call Me Big Da-Ta

In the 90s, Hip Hop was divided along very crude lines between Biggie Smalls and Tupac Shakur.

One could not love both; one had to pick sides, coordinate the right bandana colors and wave the right gang signs or one’s street cred as a hip hop enthusiast was shot.

20 years later the debate seems trivial, dying in the glowing embers of a purist’s memory.

Both men are dead. (Unless you buy that 2Pac is on a beach in Cuba smoking a cigar!)

To claim West Coast or East Coast victory at this point seems outdated, especially when the champion of that rap battle is Big Data.  For example, if ‘rap greatness’ were defined by having the largest unique vocabulary, Wu-Tang would beat Shakespeare, 5,895 words to 5,170 (and Shakespeare has a bigger vocabulary than Biggie and 2Pac).  But Big Data has a bigger vocabulary than Wu-Tang: infinity.

Because the internet has erased geographic borders and time zones, Big Data’s ubiquity and mobility dwarf the loudest hip hop historians.

Big Data is the statistical study and technological aggregation of massive quantities of information about anything. It is not just corporate analysis of the structured, transactional items found in ERP, CRM and SCM tools; it is the joint analysis of structured and unstructured data within a company, with external data sources integrated as well.  It could include comments on Facebook, MP3s of rap music, thousands of emails, collaboration activity on SharePoint, web page content, server logs, stock market tickers, GPS imagery, car telemetry, etc.  It is so hot-messy, voluminous and fast that conventional database systems and architectures cannot process or tame it.  Valuable patterns of user insight and hidden sentiment lie within it, but extracting them can be costly.  Today’s commodity cloud architectures, open-source software (like Hadoop) and API plug-ins make it possible to prep Big Data for analytics – improving operations, reducing defects or developing new products.

All companies have to do is rent server time in the cloud to swim in the Big Data flood and learn.  Netflix and Amazon have applied Big Data in elegant ways with recommendation engines based on prior purchase habits.  Facebook combs posts and combines user signals with friends’ responses to supply highly personalized customer journeys and advertising models for businesses.

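To make the recommendation-engine idea concrete, here is a minimal sketch of the item-co-occurrence approach such engines are often described as building on. The watch histories, titles and function name below are invented for illustration; real engines run far more sophisticated models over millions of histories.

```python
from collections import Counter

# Invented toy watch histories; a real engine mines millions of these.
histories = [
    {"X", "Y", "Z"},
    {"X", "Y"},
    {"X", "Z"},
    {"Y", "Z"},
]

def recommend(item, histories, k=2):
    """Recommend the k titles most often co-watched with `item`."""
    co_watched = Counter()
    for history in histories:
        if item in history:
            co_watched.update(history - {item})  # count titles watched alongside it
    return [title for title, _ in co_watched.most_common(k)]

print(recommend("X", histories))  # the titles people who watched X also watched
```

People who watched X also tended to watch Y and Z, so those are what X’s viewers get recommended – the “watched X, so try Y and Z” pattern in its simplest form.
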
However, that dataset consistency often deteriorates between physical and digital brand channels.  How many times have you seen something cool on a mobile site, then walked into the store, and the staff doesn’t have the foggiest idea what you’re talking about?  It’s because their organizations and systems are not designed that way.  Ecommerce platforms track cookies and pixels, but reps at retail establishments and bouncers at a club are not going to check your iPhone browser history.  Because of a historical focus on platforms rather than functions, businesses will build an in-store point-of-sale system, then a website, then a mobile app, all with different architectures.  That makes Big Data a conundrum: the more we know about it, the less we know about it.

Thus, in order to beat Big Data, companies must embrace data analytics as a core competency or be stuck in survival mode.  When Biggie and 2Pac grabbed the mic, 1) message, 2) flow, 3) delivery, 4) technique and 5) beats/production were all you needed to be considered a good MC.  Today, I would argue that smart MCs need to be data-literate as well as English-literate.  McKinsey & Co. forecast a shortage of at least 1.5M people with data skills by 2018.  In the coming decade, statistics and probability will be the new in-demand languages for dismantling the volume, velocity, veracity and variety of Big Data.  Only those who reprogram and reskill themselves for an analytic future should be crowned “greatest of all time.”

A/B or Not To Be

Decisions, decisions, decisions…

What if we could reduce the noise and complexity of our lives with a TV remote and dial everything down to the signal of the critical few?

In the 90’s there was this movie called “Sliding Doors”, starring Gwyneth Paltrow. THE PLOT: A British woman’s personal romance and career both hinge (without her knowing) on whether or not she catches a train. The trigger of everything to come is nothing more than a sliding door: A) what her life would have been if she slipped through the sliding doors and made the train, and B) what her life would have been if she didn’t. And we watch both scenarios unravel in parallel.

“Sliding Doors” explored the idea of parallel realities in philosophical terms. What happens when we as individuals consciously choose one path over another? Door #1 or door #2 (or #10)? But in business, making the wrong choice can have negative impacts on the bottom line.

Which is why if you are not an A/B organization, you are not an agile or smart organization. If you’re not measuring, you’re not marketing.

A/B testing starts with the assumption that there is no truth. It uses the technique of hypothesis testing, which is analogous to the US legal system, where one is innocent until proven guilty: claims are never proven true, only rejected or not rejected, and the burden of proof resides with the accuser. As you build evidence, you either reject your default position (the status quo) or fail to reject it. Therefore, in order to self-optimize, you must keep testing your assumptions and improving the outcomes iteratively.

A/B splitting is akin to a consumer focus group: different variations of content are run against a baseline control. Direct mail campaigns were the first form of A/B, where variants of the same offer were presented to target groups and response rates measured. But on the web, it moves at an accelerated pace. The web gives instantaneous feedback and allows you to experiment at near-zero marginal cost. When tests trend positive, A/B platforms like Google Analytics or Adobe’s Test and Target let you disable the control group with one click, so the content that performs best is immediately exposed to all site visitors. You can geo-target, behavior-target, even weather-target, not just on your site but in email campaigns.

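Under the hood, platforms like these typically run some variant of a two-proportion z-test on conversion counts. A minimal sketch in Python (the function name and the sample numbers are mine, for illustration):

```python
import math

def ab_test(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: does variant B convert at a
    significantly different rate than control A?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)        # conversion rate under H0
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))      # two-sided p-value
    return z, p_value

# Control: 200 of 5,000 visitors convert; variant: 260 of 5,000.
z, p = ab_test(200, 5000, 260, 5000)
print(f"z = {z:.2f}, p-value = {p:.4f}")
```

If the p-value falls below your significance threshold (0.05 is the usual default), you reject H0 and promote the variant; otherwise the status quo stands.
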
Advertisers and marketers can turn on a dime, make adjustments within minutes and generate significant ROI. There are so many things in life we want to do over again: that swing at bat, that curse word you hurled at your lover. With A/B you don’t have to go down the rabbit hole. Fail first, fail fast, fail often. A/B all day, every day.

Me & Steve

The other day, I asked my wife, “If the version of me today ran into the version of me 10 years ago on the street, would the version of me 10 years ago like the version of me today?”

My wife said, “Yes, because you became the big brother you always wanted back then but didn’t have.”

When I was younger, I was a heavy programmer.  No joke.  I made video games in BASIC or Pascal and then converted them to assembly language on old computers my parents got from work. These games were rudimentary, on par with Space Invaders or Pong (pre-internet, pre-HTML, pre-Flash animation), but amusing no less.  I would take the bus home from school with a long printout of computer code in my hand.  After eating dinner, I would stay up late studying “for-next” loops, “if-then” conditional statements and Boolean logic, fixing some bug or analyzing why a snippet of code wasn’t rendering graphical animation the way my Nintendo games did.  For a hot second, one could have confused me for an extra in the movie “Revenge of the Nerds.”

At the time, Steve Jobs had left Apple and was releasing his latest concoction, NeXT, which was supposed to revolutionize the computer world yet again.  The famous Steve Jobs, who had brought the personal computer to the masses under the inimitable multicolored Apple logo.  I didn’t know much else about the computer world (like the software revolution Microsoft was waging), but I did know the long-haired, jean-wearing Steve Jobs, and I knew I wanted to be like him: the magician and futurist able to make people fall in love with a mouse and a small screen.

I don’t know what it was about him.  Maybe it was the fact that he wore a black turtleneck all the time, which made him the “blackest white man” I ever saw in business and technology in Silicon Valley.  And so I could relate to him, even look up to him as a big brother.

Then his next big act, NeXT, hit the market… and it flopped.  It was not a mainstream success like the Apple II.  For most of the 90s Steve Jobs was written off, relegated to the background of Bill Gates’ dominance with Microsoft.  Ironically, as I got older my interests began to partition, and other passions took over, far from computers and the world of tech.  In a way, I had no big brother to emulate in that world, so that world became less relevant and I got into other things.

For example, the version of me 10 years ago was greatly into film, music and poetry.  If I had told the version of myself 10 years ago that in ten years he would be getting an MBA and pursuing a career in technology and marketing, he would have been like, “What are you on? Are you crazy?”

This brings me to my main point about Steve:

A month ago I was trying to engage a smart college kid (with an engineering background) in a conversation about bridging the digital divide in inner-city communities by getting Fortune 500 companies to invest technology and entrepreneurship resources in those communities.  The kid looked at me, baffled, and asked, “Why would Fortune 500 companies invest in inner-city kids when these kids have not demonstrated any history of success in technology, engineering, entrepreneurship or Silicon Valley?”

Fair point… but then I thought about Steve Jobs’s personal background.  He was born out of wedlock.  He was given up by his parents.  He dropped out of college after one year, and while he was in college he took an art class.  And he did not have an engineering background like this kid. In fact, Steve Jobs’s background is more similar to these inner-city kids’ than this college kid’s. Yet Steve Jobs changed the world, flipped the business rules for consumer products, and took Apple from the brink of bankruptcy to being the richest, most innovative company in the world by 2010.

Most people like to remind talented people not to forget them if they “make it.”  But the reverse holds true as well: people should not forget talented people if they don’t “make it.” There is talent in under-served communities that doesn’t “make it.”

In honoring Steve Jobs, I would like to honor the times he was not as successful, during his NeXT years, because that was when he was like a big brother to me.  His ability to rise again from the ashes with the iPod, iPhone and iPad reminds me still that anything is possible and that I am not as crazy as that college kid made me feel.  I hope that in my journey to aid non-traditional path-makers with my personal story – Street Poet to MBA – I can be for others the inspiration that Steve was for me…

Staying hungry, staying foolish

Letter to myself as First Year MBA

You can feel the walls of Sage Hall at the Johnson School closing in as the Fall Semester begins again… a new crop of MBA students with the best intentions is about to get sucked into the vortex of over-achievement: club officer competition, speed networking, Fortune 50 recruitment, and finally the “Core” curriculum, where you are expected to absorb an incredible volume of case studies and quantitative material right from the get-go. The first set of quizzes happens at 7:30 in the morning, and afterward everyone is wigging out; they are used to academically crushing their peers, but it’s different now. Later in the day, everyone will stampede to their mailboxes to get their quiz results. An elite few will be samba dancing as if they are on streets just liberated from a national dictator. Most 1st-year students, however, will be peeling their faces off those same liberated streets, because they don’t know what hit them.

I was one of those 1st-year MBA students last fall semester, stressed out by all of the above, peeling my face off the curb. The “Core” was a beast, and frustrating, because at the time I did not know the relevance of half the things I was learning. (For those reading who don’t fully grasp the difficulty of the “Core” curriculum at Johnson, think how hard it would be to ride a bicycle while building it. You’ve got no seat, no handlebars, but you’re pedaling, because if you don’t you will fall/fail.)

After the summer and a wonderful internship experience, I have a modified, more upbeat perspective. I now more fully understand the importance of all those “Core” classes. The MBA is less a silver bullet with a CEO nameplate and a six-figure salary than an immersion that expands your toolkit, your bag of tricks in analysis and managerial decision-making. There were situations at my internship where I did not know how to act on a problem technically, but I was able to illuminate where we should start and create a road-map. For example, in one situation I was doing A/B and multivariate testing and had to determine whether the results we were getting were statistically significant. I was able to ask the right questions, organize resources like my Statistics Professor, get clear on the null hypothesis and the difference between H1 and H0, derive the p-value, and make a recommendation to my team.
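To make that p-value step concrete, here is a minimal sketch of a two-proportion z-test, the standard way to check an A/B conversion-rate difference against the null hypothesis H0 (no difference). The conversion counts are hypothetical, not my internship's actual numbers:

```python
import math

def ab_test_p_value(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: p-value for H0 'A and B convert at the same rate'."""
    p_a = conv_a / n_a
    p_b = conv_b / n_b
    # Pooled conversion rate, assuming H0 is true
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal distribution
    p_value = math.erfc(abs(z) / math.sqrt(2))
    return z, p_value

# Hypothetical test: variant B converts 120 of 1000 visitors vs. A's 100 of 1000
z, p = ab_test_p_value(100, 1000, 120, 1000)
print(f"z = {z:.2f}, p = {p:.3f}")
```

If p falls below your significance threshold (commonly 0.05), you reject H0 and conclude the variants really do convert differently; otherwise the gap could plausibly be noise.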

Or there was another instance when I was talking to a mentor within the company who worked in corporate strategy, and I referenced running a DCF model to evaluate whether an investment should be green-lighted based on the company’s weighted average cost of capital (WACC). When I was taking statistics and corporate finance the previous fall, I seriously thought I was wasting my time, because I would not need these things in digital business strategy and leadership training. Yet in all of these situations at my summer internship I was able to engage my peers dynamically and add value by solving their particular problems. Over time, I recognized that the more often you are the problem solver in a group paralyzed by uncertainty, the more you are perceived as a leader. And if there are enough situations where you solve the problem and are perceived as the leader, you become the leader. That’s the power of the MBA.

So the MBA does not give you all the answers, but it does boost your self-confidence. You approach uncertainty almost thumbing your nose with Bruce Lee-like arrogance, because you know there is a solution to everything and you have the capacity to make it happen.

My best piece of advice to first year students is “Do not expect to get it all.” You won’t. The goal of the “Core” of any top MBA program is to dunk you into a pool of information up to your eyeballs, almost until you drown and have near-death experiences. These programs rock your confidence on purpose to make you stronger and better equipped in the real world.

An entrepreneurship expert once told me that venture capitalists aggressively scout out and place a special premium on founders who have had near-death experiences or been through 12-step programs. The reason is that VCs know the person who nearly dies and lives to tell the tale is the person who stays committed to the venture even under high risk and uncertainty. That is the person VCs put their money on. So don’t worry! I am certain you will be alright!