Men In Blazers PL Mood Table – How It Works

Odds are, if you're reading this, you either represent the part of the Venn diagram where my Microsoft data platform buddies and Premier League soccer fans intersect, or you clicked the link in today's Men In Blazers Raven because you were curious about how the Premier League mood table works. Rest assured, if you can think of different ways to do this and tweaks to make, I'd love to hear about them in the comments (unless you're an Arsenal supporter – mostly kidding)! I'm passionate about data solutions and am always trying to stay abreast of rapidly changing platforms and technologies. Feel free to reach out to correct me, congratulate me, or just to connect with me professionally!

Before we go any further, I owe my buddy Bradley Ball (b|t) a tip of the cap and several cold beverages for the original blog post that provided the inspiration behind the Mood Table. If you're interested in a deeper dive than this high-level overview, click through to my deep dive post here or his original post here.

First off, the foundation of the mood table is an Azure Logic App (basically a workflow container) created for each of the twenty Premier League clubs. Each logic app contains a Twitter connector that, when active (more on that later), searches Twitter for tweets containing that club's official Twitter handle. The tweets the connector collects are fed into the Sentiment Analysis function of the Cognitive Services API in Azure. The Sentiment Analysis function adds a sentiment score to each tweet (0 is completely negative and 1 is completely positive) and passes it on to the final couple of steps in the process.

Once the sentiment score of the tweet is calculated, the tweet and its corresponding data (text, score, originating account, date created, etc.) are fed into a loop container that parses out each club handle in the tweet and associates the tweet with every club tagged within it. Finally, all of this data is stored in an Azure SQL Database (basically a database in the cloud) for persistent storage, querying, and analysis.

It is important to point out that each club's logic app runs for a 10-minute window around the end of each match. It captures roughly the last 2-3 minutes of match action as well as the 7-8 minutes immediately following the full-time whistle. As Rog told me, the intention is to capture the sentiment of tweets akin to when you turn to your mates (or buddies, in 'Merican) at the pub and offer your instant analysis on your club's success, failure, or utter mediocrity in the match.

Before I get into a brief description of the query I use to provide this critical information to Rog and Davo, I do want to mention that there was originally a real-time component of the mood table as well. I had written a Power BI report that showed a live table throughout the matches as tweets and their data streamed into the Power BI streaming dataset. After chatting with Rog, though, that was deemed far too optimal for MIB (not to mention quite pricey for my personal Azure account) and we settled on the current sentiment information surrounding the final whistle. It may make an appearance at some point, though!

Once all this Azure coolness has done the heavy lifting, the actual query that compiles the results is fairly straightforward. It takes the average of the sentiment scores tagged to a specific club and then ranks those clubs (and their corresponding average scores) from the highest average to the lowest average. The only data it excludes is tweets that contain only an image (as the sentiment analysis cannot score images – yet). Sadly, your hilarious soccer GIFs are not yet understood by the mood table.
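For the nerdier GFOPs, here's the spirit of that query in a minimal sketch. To be clear, this runs against a hypothetical dbo.TweetSentiment table, not the production schema – and note that image-only tweets simply never receive a score, so filtering out NULL scores drops them from the average.

```python
import pyodbc

# A sketch of the mood table ranking query against a hypothetical
# dbo.TweetSentiment table (one row per club tagged in each tweet).
MOOD_TABLE_QUERY = """
SELECT Club,
       AVG(SentimentScore) AS AvgSentiment
FROM dbo.TweetSentiment
WHERE SentimentScore IS NOT NULL  -- image-only tweets have no score
GROUP BY Club
ORDER BY AvgSentiment DESC;       -- happiest supporters first
"""

conn = pyodbc.connect("DSN=MoodTableDb")  # placeholder connection string
for rank, (club, avg_sentiment) in enumerate(conn.cursor().execute(MOOD_TABLE_QUERY), start=1):
    print(f"{rank:2d}. {club}: {avg_sentiment:.3f}")
```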

In a nutshell, that's it. I look forward to comments, questions, and connections. As I mentioned, I have a much more detailed technical deep dive that you can read here. This has been a blast to work on and we have some potentially interesting plans for it moving forward. Finally, there is no truth to the rumor that Rog is paying the Russians to create bots to elevate Everton to the top spot. That is just the natural optimism and enthusiasm of Evertonians shining through! To the football…

Men In Blazers Premier League Mood Table – Deep Dive

Welcome nerdy GFOPs! If you've made it this far, you're likely prepared for the nerdiness (and lengthy post) that you are about to dive into. While our assumption was that some GFOPs would be curious, at a high level, about how the Premier League mood table works, my hope was that some GFOPs who work in analytics, data science, or anywhere in the data platform realm would be curious enough to dig into what I've done, discuss it, and hopefully build upon it to do cool things with the mood table or in our professional work. With that greeting, let me get to a couple of acknowledgements and then we'll get nerdy.

There are two blog posts that provided much of the foundation for the mood table. First, the inspiration for the mood table came from a blog written by a friend and former boss of mine who now works at Microsoft, Brad Ball (b|t). Please go check out his blog post to better understand the foundation of the mood table. The second blog that proved to be an immense help can be found here. It was written by Toon Vanhoutte. I don't know Toon, but I owe him a beer or three for a wonderful post laying out, in necessary detail, how to schedule the execution of Azure Logic Apps. Without being able to schedule the enabling and disabling of these logic apps, the mood table would likely be too expensive to run regularly. So, cheers to you, Brad and Toon! Let's get into the guts of the mood table.

The foundation of the mood table is an Azure Logic App created for each of the 20 clubs. Because of character limits on some of the object names, I was forced to use creative abbreviations for some of the clubs. Apologies for the name below, Arsenal supporters, but I am a Spurs supporter after all! This screenshot shows what is inside each logic app – I’ll go into more detail about each component below the screenshot.

[Screenshot: logic_app_1 – the components inside each club's logic app]

The first component (labeled “When a new tweet is posted”) is the Twitter connector itself. It signs into Twitter and searches for tweets containing the club’s official Twitter handle. It passes those tweets into the second component (labeled “Detect Sentiment”). That is where the sentiment analysis actually takes place. That component analyzes the text (not images – yet) of the tweet and scores it from 0 (most negative) to 1 (most positive). The sentiment score is added to that dataset and passed to the third component – the for each loop container.
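If you're curious what the Detect Sentiment connector is doing under the hood, it boils down to a call like the one sketched below against the (2017-era) Text Analytics v2.0 endpoint. This is an illustrative sketch, not the logic app itself – the region and key are placeholders you'd swap for your own Cognitive Services resource.

```python
import requests

REGION = "eastus"                     # placeholder - your Cognitive Services region
SUBSCRIPTION_KEY = "<your-key-here>"  # placeholder - your Text Analytics key
ENDPOINT = f"https://{REGION}.api.cognitive.microsoft.com/text/analytics/v2.0/sentiment"

def score_tweet(tweet_text):
    """Return a sentiment score between 0 (most negative) and 1 (most positive)."""
    payload = {"documents": [{"id": "1", "language": "en", "text": tweet_text}]}
    headers = {"Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY}
    response = requests.post(ENDPOINT, json=payload, headers=headers)
    response.raise_for_status()
    return response.json()["documents"][0]["score"]

print(score_tweet("What a finish! Spurs were brilliant today. #COYS"))
```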

The loop container (labeled “For each 2”) parses each tweet and tags it with all clubs mentioned within the tweet. It then inserts that data into an Azure SQL Database (via the “Insert row” component in red) for storage, querying, and analysis.
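Outside of a logic app, the loop and insert amount to something like the sketch below: pull every handle out of the tweet, keep the ones that map to Premier League clubs, and write one row per club. The table and column names here are hypothetical, not the mood table's actual schema.

```python
import re
import pyodbc

# Hypothetical subset of the club-handle lookup (the real list covers all 20 clubs).
CLUB_HANDLES = {
    "@SpursOfficial": "Tottenham",
    "@Everton": "Everton",
    "@LFC": "Liverpool",
}

def store_tweet(conn, tweet_text, sentiment_score, author, created_at):
    """Insert one row per Premier League club tagged in the tweet."""
    handles = re.findall(r"@\w+", tweet_text)
    clubs = {CLUB_HANDLES[h] for h in handles if h in CLUB_HANDLES}
    cursor = conn.cursor()
    for club in clubs:
        cursor.execute(
            "INSERT INTO dbo.TweetSentiment "
            "(Club, TweetText, SentimentScore, Author, CreatedAt) "
            "VALUES (?, ?, ?, ?, ?)",
            club, tweet_text, sentiment_score, author, created_at)
    conn.commit()
```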

When I spun up the mood table for the first time, this is all it was. There was no scheduling component and it was just streaming tweets into my Azure SQL DB during all the matches. That proved to be prohibitively expensive, so let me walk you through some gotchas before we get into the scheduling components of the mood table.

Gotchas (i.e. how to not bankrupt yourself in Azure)

Gotcha 1: Twitter Connector Does Not Die (unless you kill it)

When I first started using the Twitter connector (screenshot below), I noticed that it asked a relatively simple question: "How often do you want to check for items?" Based on my research and discussions with others, our understanding was that the connector would wake up on that defined interval, check for tweets meeting the search criteria, and then shut down again until the next interval.

[Screenshot: logic_app_2 – the Twitter connector's polling interval setting]

Unfortunately, that understanding was incorrect. The first full weekend I ran the mood table, it cost several hundred dollars because the connector never stopped running – there were always new tweets for it to find. Because all of this is billed at a fixed cost per operation, the costs mounted rapidly throughout the weekend, especially during matches involving large clubs with supporters throughout the world. My wife was relatively unimpressed by my mistake, so I wanted to mention it here to save you from a similar fate should you begin experimenting with this.

Gotcha 2: Scheduling Logic Apps Always Works (Eventually)

After a discussion with Azure Support to clarify exactly how the connector works, I dug into trying to figure out how to schedule these logic apps so they would enable and disable on command. That would allow me to do two things: 1) capture the time intervals Rog and Davo wanted during or after the match and 2) keep the cost manageable so I would not have to sell a limb or one of my children to continue financing the mood table.

That led me to Azure Job Collections and Azure Scheduler, and to Toon's wonderful blog on how to wire this up. I will not rehash his blog here, but please go read it if you want a step-by-step walkthrough of what I did. One note I will add to Toon's blog: wherever he references <workflow name>, that refers to the name of the logic app itself. It took some trial and error for me to discover that, so I want to save my readers that time and effort.
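For reference, the enable and disable jobs ultimately boil down to POSTs against the Azure management REST API. The sketch below shows the shape of those calls – the subscription, resource group, and workflow names are placeholders, and acquiring the bearer token is elided.

```python
import requests

SUBSCRIPTION_ID = "<subscription-id>"  # placeholder
RESOURCE_GROUP = "<resource-group>"    # placeholder
API_VERSION = "2016-06-01"

def set_logic_app_state(workflow_name, action, bearer_token):
    """Enable or disable a logic app; `action` is 'enable' or 'disable'.
    These are the same endpoints the scheduler jobs are configured to call."""
    url = (f"https://management.azure.com/subscriptions/{SUBSCRIPTION_ID}"
           f"/resourceGroups/{RESOURCE_GROUP}/providers/Microsoft.Logic"
           f"/workflows/{workflow_name}/{action}?api-version={API_VERSION}")
    response = requests.post(url, headers={"Authorization": f"Bearer {bearer_token}"})
    response.raise_for_status()

# e.g. set_logic_app_state("pl-mood-spurs", "disable", token)  # hypothetical name
```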

My job collection contains an enable and a disable job for each club's logic app (for a total of 40 jobs). They run on set schedules (which I update manually) to capture the 2-3 minutes before the final whistle and the 7-8 minutes after it. Rog wants those instant post-match sentiments – the gut reaction you have to your club's match as the final whistle blows.
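If you wanted to compute those windows rather than eyeball them (again, I update mine manually in the portal), the arithmetic is trivial – a purely hypothetical helper:

```python
from datetime import datetime, timedelta

def capture_window(projected_full_time):
    """Return (enable_at, disable_at): ~3 minutes before the projected
    full-time whistle to ~8 minutes after it."""
    return (projected_full_time - timedelta(minutes=3),
            projected_full_time + timedelta(minutes=8))

# A 3:00 PM kickoff projects to a full-time whistle around 4:52 PM.
enable_at, disable_at = capture_window(datetime(2017, 12, 9, 16, 52))
print(f"Enable at {enable_at:%H:%M}, disable at {disable_at:%H:%M}")
```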

[Screenshot: logic_app_3 – the job collection of enable/disable jobs]

This all seemed a brilliant idea until the next set of matches rolled around and I realized that some of the disable jobs failed their initial attempt – leading to yet more money accidentally donated to Microsoft. The Azure team’s holiday party likely got a bit better as a result of my mistakes!

Luckily, you can apply retry logic to these jobs (see below). I strongly recommend this as it will save you many disapproving looks from your significant other for making it rain in the Azure bill once again.

[Screenshot: logic_app_4 – retry policy settings for a scheduler job]

Gotcha 3: Having A Laptop At The Pub Is Dorky (use the Azure iOS app)

The first weekend or two I ran the mood table, I was monitoring it via a laptop while watching matches at the pub. I'm told, by reliable sources, that this made me look like a dork. It also meant that I could not reliably monitor the jobs enabling and disabling on schedule unless I was somewhere with 1) room for a laptop and 2) decent Wi-Fi.

To use an obviously hypothetical example relayed to me by a friend of mine (wink wink, nudge nudge), let's say a big match ended in the middle of a Sunday church service and "my friend" wanted to make sure the jobs kicked off on schedule. Taking out a laptop during the service would be deemed inappropriate by my friend's family and his priest, and would likely incur the risk of him being struck by lightning were the Good Lord to catch a glimpse of such activity. How could my friend solve this issue?

He used the Azure Portal iOS app found here (it’s also available for Android). That allowed “my friend” to ensure that the club’s logic apps enabled and disabled on schedule and that the tweets were properly captured. I’m also delighted to report that “my friend” was not struck by lightning while checking this during church.

Things Deemed Too Optimal For The Mood Table

Originally the mood table had a live component with it: a Power BI report that accepted the tweet information via a streaming dataset and showed a live table throughout the matches with up-to-the-second rankings. While it was fascinating, Rog wanted to focus on the end-of-match sentiments for the static table.
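For anyone curious how that live table was fed: a Power BI streaming dataset exposes a push URL, and getting rows into it is a single HTTP POST. Here's a hedged sketch – the URL and column names are placeholders for whatever your streaming dataset defines, not the mood table's actual dataset.

```python
import requests

# Placeholder - Power BI generates the real push URL (embedding the dataset ID
# and an auth key) when you create the streaming dataset.
PUSH_URL = "https://api.powerbi.com/beta/<workspace>/datasets/<dataset-id>/rows?key=<key>"

def push_tweet(club, sentiment_score, created_at_iso):
    """Push one scored tweet into the streaming dataset behind the live table."""
    rows = [{"Club": club, "SentimentScore": sentiment_score,
             "CreatedAt": created_at_iso}]
    response = requests.post(PUSH_URL, json=rows)
    response.raise_for_status()

push_tweet("Everton", 0.91, "2017-12-09T17:02:31Z")
```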

The ability to analyze streaming sentiment data in this way, however, may make an appearance for individual Premier League matches or, possibly, a major soccer tournament taking place this summer (no, not the Alternative Facts World Cup).

Conclusion

If you’ve read all the way to the end, I salute you! I’ve enjoyed working with the guys on the Premier League mood table and I’ve enjoyed putting together this deep dive post for those of you that are interested. As I mentioned, I look forward to questions or corrections (throw them in the comments) or connections via Twitter or LinkedIn. Cheers (and come on you Spurs)!

Coming 12/8/17 – Premier League Mood Table How-to

Greetings GFOPs (and others who have clicked on this post)! There were a few mentions on social media that I'd be publishing two blog posts here this afternoon. The first was going to be a high-level, layman's explanation of how the Premier League Mood Table I built for Men In Blazers (b|t) works, and the second was going to be a deep dive into the nuts and bolts of how I built it, the mistakes I made, the ways I fixed those mistakes, and just how flipping cool Azure Logic Apps and social sentiment analysis are. That second post will form the foundation for a SQL Saturday/conference talk I intend to submit and present throughout 2018 (feel free to contact me if this interests you and your user group/conference/organization).

These blogs were originally timed with the 12/1 release of MIB’s latest Raven (their newsletter for readers who don’t listen to their podcast – register here if you’re not a subscriber) containing an interview with me, thus the handful of tweets and other social posts mentioning today as the day to take a look at my site.

Unfortunately, in typically suboptimal MIB fashion, the 12/1 Raven has been delayed until 12/8, thus Mauricio Pochettino (and I) make the sad face above. My blogs will be released in conjunction with that Raven, so we’ll see you back here next Friday. If you’re interested in soccer and/or data and/or social media analysis potentially relevant to your company, it will be worth a read. Have a great weekend and see you back here Friday, December 8!

Coast to Coast in the Next 30 Days!

It would be an understatement to say that I'm excited about this month in my SQL community life. While I have multiple submissions out to European conferences in the first half of 2018, this month's highlights are two confirmed speaking engagements: SQL Saturday Charlotte on October 14th and PASS Summit on November 3rd.

As my previous post mentioned, I used to live just south of Charlotte and haven’t been back in years, so I’m looking forward to seeing some friends in the area along with meeting and reconnecting with more #sqlfamily.

While Charlotte is going to be a great event (and you should definitely register here), the coolest thing to happen this month will be my PASS Summit speaking debut on Friday, November 3 at 11 AM local time. I’m incredibly proud to have been selected to speak at Summit and am looking forward to unveiling new elements and new demos during my “Where Should My Data Live (and Why)?” session.

This session is all about opening more traditional database administrators' eyes to the opportunities that cloud platforms and technologies give them to leverage and extend their existing on-premises implementations and deployments. I look forward to sharing what I know and learning from the crowd about their own experiences so I can improve this talk in the future as I continue to speak and our data professional world continues to evolve. Hope to see you in Eastern time or Pacific time in the next 30 days!

SQL Saturday Charlotte – I’m Speaking!

I’m excited to announce that I’m speaking at SQL Saturday Charlotte (#683) on Saturday, October 14, 2017! I’ll be speaking on the final time slot of the day and giving a new talk of mine – “Where Should My Data Live (and Why)?”.

I'm really excited about this opportunity for a couple of reasons. First, any opportunity to attend a SQL Saturday means I'm guaranteed to learn something, whether it's a technical fact, a presenting tip, or something else. I think SQL Saturdays are, hands down, the finest free technical training available in the data professional community. Second, I used to live near Charlotte (Fort Mill, SC), so that weekend should be a great opportunity to catch up with both professional colleagues and old friends who call the Charlotte area home. I haven't been to Charlotte since PASS Summit 2013 – it will be great to get back!

Click here to register – and I can’t wait to see you at SQL Saturday Charlotte on 10/14!

T-SQL Tuesday #93: Shock and Subtlety of Sexist Interviewers

First of all, thanks to Kendra Little (b|t) for hosting this month’s T-SQL Tuesday. This month’s topic (Interviewing Patterns & Anti-Patterns) is a great topic that’s generating a lot of interesting responses from many different perspectives. Beyond that, I’ve seen Kendra present at PASS events and various webinars and she presents deep technical content in a very engaging way. Definitely check out her blog and follow her on Twitter!

As for my submission to this month’s blog party, I was excited to cover the original topic I had for this post – “Interviews Aren’t Trivia Contests”. I may still blog that at some point, as I believe I’ve definitely improved as an interviewer and would like to pass along some things I’ve learned the hard way so you don’t make the same mistakes I have.

That said, a couple of conversations I had at SQL Saturday Louisville this weekend changed my mind about my post for this edition of T-SQL Tuesday. Even though my wife has been in technical fields her entire career and I've both managed and worked alongside women in IT, hearing women discuss the subtle and overt sexism they have to deal with in our industry is always jarring, upsetting, and thought-provoking. Those conversations prompted me to relate an interview story of my wife's. This post is most certainly about a couple of interviewing anti-patterns, as you'll see below.

While my wife is currently in IT (she is a PMP-certified project manager specializing in software delivery and implementation), this story dates from 2012 when she was interviewing for a ceramic and materials engineering position in the Midwest. She has a B.S. in Ceramic Engineering, an M.S. in Materials Science and Engineering, and has her name on at least one patent and multiple academic papers. Long story short, she was indisputably qualified for the position for which she was interviewing.

Anti-Pattern 1: Subtle Sexism

This interview, as many are, was a series of one-on-one meetings with folks in HR, on the technical side, and points in between. There had been nothing particularly noteworthy until she interviewed with a guy in a very senior technical position. After a few minutes, he asked my wife a seemingly innocuous, albeit cringe-worthy, question (more on that in a bit): "How does a woman get into engineering?" My wife explained her interest in and aptitude for math, and then a little more about what drove her specifically toward materials science and engineering.

Anti-Pattern 2: Shocking Sexism

His response was "Interesting – most women who get into engineering are more flat-chested." This is the part of the blog where you should hear a record-scratching noise in your head as you're shocked by what you just read. Sadly, the few times I've relayed this story over the years, the women I tell are not nearly as surprised as the men. It goes without saying that this is an interviewing anti-pattern of the highest order. It's sexist, demeaning, crude, lawsuit-worthy at best, and illegal at worst.

Summary

But I said I'd take you back to the seemingly innocuous question, as over the years it's troubled me nearly as much as the obviously horrifying commentary on my wife's figure during an interview. "How does a woman get into engineering?" As one of my wife's friends said, the proper response was "the same way a man does." The subtlety of this is perhaps more insidious than the overt sexism of the crude comment, as the implication is "why are you here? You don't seem to belong."

If you take anything away from this post, I want it to be this sentiment: as an interviewer, the candidate across the desk (or phone) from you is there because they believe they're qualified for the opportunity and want to work with you and your company. Everybody's career journey is different, but the subtle or overt implication that, because they don't fit the stereotype in your head, they don't belong there is simply unacceptable. Not only could you be costing your company the best candidate for the position, you may plant a seed in that person's head that takes them years to overcome or puts them off their chosen career path entirely.

To end this on a positive note, this did not have a negative impact on my wife's mentality and she's fantastic at her current job. I still wish she had slapped the guy, though!

SQL Saturday Louisville – I’m Speaking!

As readers of the blog know, the last few weeks have been quite hectic on the racing side of my life, so I apologize for the delay in this announcement. I'm thrilled to announce that I've been accepted to speak at SQL Saturday Louisville on August 5th! This will be my fourth SQL Saturday presentation this year (following Cleveland, NYC, and Atlanta). That was a personal goal of mine for 2017, and I am incredibly appreciative of being selected for four SQL Saturdays this year. It means a lot, especially to be selected for my "hometown" SQL Saturday (I hail from Lexington, KY, about an hour from the SQL Saturday Louisville venue).

SQL Saturday Louisville hosted my first SQL Saturday presentation last year, so it's cool to bring it full circle and present a new session – "How To Keep Your Database Servers Out of the News" – a year later. I'm excited to give this presentation at a PASS event, as I've made a lot of changes to it since its initial creation last year. It's certainly a subject whose importance increases as time goes on, so I look forward to giving the talk and getting feedback from the audience.

Beyond me, the speaker list for this year's edition of SQL Saturday Louisville is fantastic. As always, there will be lots of good information disseminated on a wide variety of data platform topics, and the presenters are a who's who of SQL Server and Microsoft data platform experts. If I've piqued your curiosity, click here to register for the event.

This year’s event also features three outstanding pre-cons – the information on those can be found in the middle of this page. While all three sessions will be outstanding, I’ll be attending my buddy Josh Luedeman’s (b|t) pre-con on “Building Your Modern Data Architecture”. I had the good fortune to work with Josh before he moved to Microsoft and I’m really looking forward to this session.

I can’t recommend this year’s edition of SQL Saturday Louisville enough. If the outstanding speaker list and great pre-cons haven’t convinced you, don’t forget that here in Kentucky we have delicious, delicious bourbon. Bourbon and free SQL Server training – what a great way to spend a Saturday! Hope to see you there!

Old Habits Cost You Money

As a consultant, I spend a lot of time with customers whose most significant pain point is what they're spending on SQL Server licensing. In general, they're all facing a similar scenario: they've found an architecture that works for them, and as they scale it out for new clients or new users they continue purchasing the same servers (with the same version and edition of SQL Server) that have always worked. While there's nothing wrong with that, eventually management starts asking some questions:

  1. Why do we need all these servers when IT says they’re barely using any CPU?
  2. What do all these servers do?
  3. Why are we using X-year-old software?

As DBAs (especially those of us who wear the architect hat as well), we’re in a constant battle between priorities 1 and 1A: ensuring maximum uptime for our customers and spending the least amount of money to achieve that uptime. Settling for an older architecture on an old version of SQL Server does a great job fulfilling priority 1 but, generally, a poor job fulfilling priority 1A. The more money we spend on licensing, the less we have to spend on training, new hardware, etc.

It’s incumbent on us to keep abreast of the evolution in the SQL Server universe. As we’ve seen, Microsoft has massively accelerated the pace of their development in the SQL Server space, whether we’re talking about the database engine itself or Azure SQL Database or something in-between.

Can your company save money and provide required uptime by a move to Azure? Do you need to upgrade to SQL Server 2016 SP1 but downgrade to Standard now that in-memory OLTP, advanced compression, and greater partitioning functionality no longer require Enterprise Edition? Do you need to use something like ScaleArc to ensure you’re leveraging your complete Always On availability group investment?

This blog would be thousands of words long if I delved into every single option, but my point is a simple one. As things in the SQL Server universe change by the month rather than by the year, we all need to keep up with the latest developments and think about how they might make our job easier and/or our architecture less expensive to license and maintain so our company can spend more money on their most valuable resource – us!

Read blogs, follow SQL Server experts on Twitter, attend SQL Saturdays, and make plans to attend PASS Summit so you can stay on the cutting edge of cost-saving developments. If regular operations and maintenance keep you from having the time to reevaluate your architecture, engage a Microsoft data platform consultant (like me!) to help you in that evolution. We all know old habits die hard, but they can cost you and your company valuable resources as well. Engage with the community to help break out of those old habits (and learn cool things too)!

Achievement Unlocked: I Raced at Indy!

As Twitter followers of mine may have noticed, I raced at the Indianapolis Motor Speedway this past weekend (6/9-6/11) as part of the Open Wheel World Challenge. I competed in rounds 3 and 4 of the Hoosier Tire US Formula First Championship Series. I’ve competed in this series for years, but I’ve only done one race in the last 18 months due to engine issues with the car and commitments for my kids. Between that and the fact that this race was at the Indianapolis Motor Speedway, let’s just say I was looking forward to the weekend a tiny bit.

This particular weekend was structured differently from a normal US Formula First weekend, as we had a practice session Friday morning, a qualifying session Friday afternoon, another qualifying session Saturday morning, then a race Saturday evening followed by another race Sunday morning. Below I'll offer a brief rundown (with a couple of pictures) of the weekend for those who are interested.

Friday Morning (Practice 1)

Friday morning’s session, in all honesty, was quite boring. As my final event last season ended up with a blown engine, the first session this weekend was a session spent breaking in the newly rebuilt engine. That means lower RPMs and, honestly, about 80% throttle. While it was a wonderful time to savor lapping at the Indianapolis Motor Speedway (road course, but still), it resulted in times 20 seconds off the fast time and a whiny driver! Breaking in an engine is necessary, but definitely frustrating.

Friday Afternoon (Qualifying 1)

Following a change from the break-in oil to racing oil, it was time to get the maximum from the car as we began to set the grid for Saturday evening’s race 1. Unfortunately, after only six laps of the session, my car developed a miss and various other electrical maladies and I was only able to manage a 1:49.73. While that was good enough for 7th in my class, it was a frustrating session that led to a long night diagnosing and fixing the problem with the car. Long story short, the distributor clamp failed and the timing was so far off as a result that the engine would not refire after I stopped in the pits. Were it not for the generosity and knowledge of fellow racers Doug Seim and Dave Dawson, we would have been hard-pressed to make the Saturday morning session. In the end, though, the car was repaired and we were ready to go for qualifying 2 Saturday morning.

Saturday Morning (Qualifying 2)

After the previous night's repairs we took to the track Saturday morning with high hopes of improving the pace and our position on the grid for race 1 Saturday evening. I managed a 1:48.1 in the session (1.6 seconds faster than qualifying 1), but given the competitiveness of the field I was still starting 7th in class (11th overall) for the Saturday evening race. We made some handling changes to the car that we thought would improve it and got ready for race 1.

Saturday Evening (Race 1)

[Photo: the race 1 start incident (credit: Brian Schell)]

Race 1 was significantly affected at the start by the incident pictured above. After what was, in my opinion, a delayed green flag, the cars starting in positions 9-20 (roughly) got a flying start while those of us at the front had to throttle back a bit in order to not jump the start. That led to 4- and 5-wide racing down the front straightaway…which led immediately into the incident pictured. Many thanks to Brian Schell for the image above – it is a brilliant shot! As you can see, I'm near the back of the image while my friend Sam Farmiga is being used as a ramp by a fellow competitor who misjudged the braking zone. Somehow, I made it through the melee and soldiered on to a 5th-place finish despite being hit by a lapped car with two laps to go. It was a solid finish, but no trophy. Sunday morning's goal was a trophy.

Sunday Morning (Race 2)

[Photo: the duck umbrella on pre-grid]

Sunday morning's goal was two-fold. First, I wanted to improve my lap times into the 1:47 range, at least. There were some handling issues with the car that likely prevented race-winning speed, but I thought 1:47 laps were attainable. Second, I wanted a trophy, so I needed to finish third. One of the grid marshals told me that if I used the duck umbrella (pictured above) I would be on the podium, so I gave it a shot!

Sunday's race started with two significant incidents. My friend Doug Seim was knocked into the air and out of the race by a fellow Formula First car in the second corner and, directly behind us, a massive incident occurred on the front straightaway with 6-8 cars caught up in the melee. That triggered an immediate red flag and a ~25-minute visit to pit lane while we waited on the cleanup. Thankfully, there were no significant injuries from any of the first-lap adventures.

When the race restarted, I found a little more pace in the car (breaking into the 1:47 range on several laps) and my fellow Formula First competitors found a bit of bad luck with reliability. After a furious 4-lap battle to pass some very quick Formula Vees and a 3-lap battle with Sam Farmiga for second in our class, I ended up third at the yard of bricks by a couple of feet. I was frustrated at the time, but receiving a third-place trophy at Indianapolis with my kids there to watch me (pictured below) soothed the frustration quite quickly.

For the next couple months this blog will return to Microsoft Data Platform-related content, but if you’ve read this far, thanks for reading! Hopefully I get to play race cars another time or two before the 2017 racing season comes to a close.

[Photo: on the podium with my kids]

We Interrupt This SQL Server Programming To Bring You Racing from Indy

I generally use this blog to let people know about community events where I’m speaking, pass along Microsoft Data Platform-related technical information I’ve found useful, or to participate in T-SQL Tuesday blog parties. For the next few days, however, this blog is going to be home to my updates from my racing debut at Indianapolis Motor Speedway. I’m driving in the Open Wheel World Challenge this weekend and, honestly, I lack the words to describe how cool this is.

Every racing driver (or at least every one who grew up in the Midwest of the United States) has dreamed of crossing that yard of bricks in any car. The fact that I am able to run at Indy this weekend, in my own car, with family and good friends supporting me, is honestly hard to believe. Hopefully we have a good, clean weekend.

After this Sunday, I plan to revert to more disciplined technical blogging. For the next few days, though, I will update this blog with racing updates as often as I am able. If you’d prefer to follow along on Instagram, please search the hashtags #jaygoracing and #sqlatspeed. Talk to you from the track!