
Essays

The Bar

Sean Moore

You must be this tall to be taken seriously.

I’ve never liked the notion that my life is little more than climbing the rungs of a ladder. The notion that life is judged by how high up you or I stand at any given rung, or more importantly, which one of us is higher. Salary, possessions, status, number of direct reports. None of that comes with you in the end, and none of it leaves any sort of meaningful mark on the world. What’s more, this hierarchical, pack-animal mentality feels so antiquated on the evolutionary time scale. We developed higher brain structures for a reason, so it seems silly to revert to what amounts to little more than teeth-baring and tails between the legs.

Still, there is some truth to this idea of different levels, in a pure intellectual and skill-set sense of the matter. At any given time, you and everyone else lie on some continuum between ignorance and mastery. For the vast majority of things in this world, the gauge hovers around that ignorance level, and that’s perfectly fine; given finite time and attention, you can only do so much with your life. What’s important is that the gauges you do focus on are always moving in one direction.


Broadly speaking, there are three regions in that continuum – the beginner, the intermediate, and the master – and they each come with their own sentiments. The beginner knows he is a beginner, because until five minutes ago, he had no idea there was even a Wikipedia page on the new-found interest he has taken up. Similarly, he has a pretty clear sense of when he is no longer a clueless novice on the subject: typically it lies somewhere between five minutes of reading and a college course he only half slept through.

Similarly, the master has a strong grasp on where she lies; she’ll write books, she’ll give lectures, and she’ll be the one person in ten thousand that goes to a help forum to give answers, rather than seek them.

That leaves a lot of ground to cover in the middle between them, though, and leaves the poor intermediate to wonder where the hell he is on the scale. Sure, he’s no beginner; in fact, he knows whatever his passion may be better than ninety-nine percent of the population.

But that remaining one percent may know his passion ninety-nine times better than he. He puts his all into improving upon his skills: building habits, devoting his spare time, reading, writing, studying, and still the gulf feels as though it does nothing but widen. The masters make the rules, and it’s all he can do to follow them.


I’m reminded of a rather apt saying: the only skiers that need poles are the beginners and the true experts. The beginners need poles, of course, because foregoing them means certain doom. The experts need poles because they’ve learned to use them as precise tools.

When does what you have at your disposal stop being a crutch and start becoming that tool? More importantly, how do you know when one has suddenly become the other? Maybe the only way is to keep your head down, eyes on your work, and your arms outstretched before you, searching for that next rung.

The Calling

Sean Moore

Somewhere, the Tarzan to your Jane is swinging through the jungle on a vine.

One of the best pieces of advice I received was on finding a calling. Keep trying things, she said, until you find whatever it is that you love doing and that you find yourself unable to do. That’s when you’ll know.

That seemed strange and counterintuitive to me at the time. There were plenty of things that I loved doing – building Lego, making papier-mâché volcanoes – and there were also plenty of things I found myself unable to do – knitting, playing the bagpipes – but the two regions didn’t seem to have any sort of overlap. This didn’t seem like good advice.

And then I discovered software design. Here was something that fit right in the center of that Venn diagram. There was the design aspect of it, being able to create whatever you imagine, to do whatever you want, a love worth having. And then there was the use of a programming language to accomplish it, which I had no grasp of, a task worthy of being unable to do.

So I took the advice I was given. And then banged my head on the wall for the next two years trying to learn something so foreign to me, so rigid compared to the fluid way I thought. The strict syntax clashed with my typical style of intentional grammatical rule-breaking. I pulled my hair out, spent countless sleepless nights going nowhere, and the effort spilled over into the rest of my life, with little to show for it.

And I loved every second of it.


Some challenges we are born into, and they must be faced for no reason other than that they are there. Some challenges, too, are thrust upon us, and we have no control over when they arrive, how they present themselves, or what tools we have to solve them. In either case, the cause is out of our hands; all we can do is try to make the effect we want.

But we also have the ability to seek out challenges. We can approach what we want, based on our interests, on our schedule, with the tools that we have at our disposal. There’s a very compelling desire in us all to approach the familiar: after all, with all these challenges that we have no control over, there’s enough uncertainty in the world.

But there’s so much more to find in the unfamiliar, the exotic, the challenges in our lives that we haven’t yet exposed ourselves to. By bringing ourselves to bear against something we know nothing about and are unprepared for, we may just learn something incredibly valuable about ourselves.

Perhaps, even, we may find a calling.

The Plan

Sean Moore

Avoid the travel guides; the best way to have an adventure is to have nothing more than a good map.

I’ve never been a great planner. If you look at my calendar, apart from a list of my classes (which, truly, I mostly keep to remind myself what room the lecture is being held in on the days I do decide to show up), there’s very little structure to my day.

Structure has always worried me. Regularity, habits, any of it: I’ve done the best I can to avoid keeping a regular schedule. Why? It’s not only because I want to throw off the Chinese spies that are undoubtedly plotting my kidnapping. I want to preserve, as best as possible, my spontaneity.

Am I a spontaneous guy? Well, no. Apart from occasionally deciding to spend a whole night on a deep dive through Wikipedia’s maritime warfare section, or writing a web app from scratch, I don’t typically decide, out of the blue, to do crazy spur-of-the-moment things.

What I do try to do though is keep my opportunities open. An open schedule means a last-minute call or text can be answered in the affirmative, a suggestion to start a cool project can be acted on.


There are drawbacks, of course, because every decision has a compromise. Lack of structure is a terrible thing. When you don’t know where you are, or where you are going, for the next hour, day, week, it’s easy to get lost.

Worse, perhaps, are the big gaps of time during the day. When there is nothing to tell you what you should be doing, it’s easy to make the wrong decision. You know, the one where you avoid all your commitments, all your homework, all your projects and plans and go on the aforementioned all-night Wikipedia binge.

Oops.

Perhaps the worst part of it all is the fear. When there is no piece of paper dictating what and where you’ll be spending your time, when your parents haven’t scheduled every activity for you from now until whenever you graduate magna cum laude from their alma mater, it can be scary. It can even be as simple as making a choice for lunch – when you are suddenly free to choose what you want to eat, rather than abide by a school schedule, you become far more concerned that you’ll make the wrong decision.

That fear can paralyze you. Every decision you make could be one that sends you down a wrong path, and what’s worse, you may not even realize it until it’s far too late to turn back. You become a reactive machine, unable to deal with anything in the future because you need to make all these decisions that are happening right now, and you don’t have time for that.


Extremes are scary. Luckily, there’s usually a compromise when two ends of a spectrum are untenable. And sitting in between a regimented schedule and spontaneous wreckage is the plan: a description of the what, how, and why of what you want to get done, rather than the when. Maybe the distinction is subtle, but the ability to dissociate the work you do from a particular time you assign to do it is incredibly freeing: you can work on what you have the energy to do, or the tools at hand, and still maintain the flexibility for your spur-of-the-moment adventures.

The plan can be as detailed or as flimsy and high-level as you like; something as simple as where you want to be in three years, for example, probably doesn’t need a whole lot of detail, because there is a whole lot of space between the you of today and the three-years-from-now self you’re trying to tell who to be (he probably wouldn’t appreciate you telling him what to do; after all, the you from now certainly doesn’t). If you are working on updating a design, more detail is better – you’re walking the coastline, after all, so you might as well give yourself as many road markers as you can, so you stay on track, on time, and don’t lose sight of what you need to finish.

My personal favorites are the plans I make during a two-week design sprint. It starts as a major feature, one or two words. Over the next day, that gets broken down into the components that make it up, and as those are implemented, each gets broken down into five or six tasks and a similar number of associated subtasks. Before you know it, you’ve got a nice fractal of a plan and more than enough to fill up your plate for two weeks, and you can work on what you’re in the mood for, rather than what’s sitting on your calendar for any given day.

The Wall

Sean Moore


We don’t need no education.

Yesterday I talked about running, and how getting started can be the hardest part. But there’s another part of the run that I dread – more so, perhaps, than getting started itself.

The wall. That terrible feeling you get, when your body is telling you “no more!”, or your brain is telling you “you have more important things to do than run right now!”. It hits all of a sudden, without warning, and when it comes all motivation disappears. There’s no energy left in the legs, no drive to take one step further.

I admire those runners that have been at it forever. They always say, once you get past that wall, that’s where the runner’s high is, the true enjoyment. That’s where you want to be. If only.

Most of the time, when that wall hits, I slow down, I drag, and I might even come to a stop, if only temporarily. There’s no better way to show myself that there’s nothing to fear about hitting the wall than taking a break for a second and letting it pass. But, in a way, it’s also giving up the fight.


Walls are everywhere – no, no, not those walls! – ready to show up in all areas of what we do. They always appear when we are facing the unknown, when we challenge ourselves. Sure, we could avoid them by solving uninteresting problems, but where’s the fun in that?

And in the same way, too, what lies beyond the wall is what’s really interesting. In starting a design problem, or an engineering implementation, or whatever knowledge work you may be doing, our first attempts draw upon the knowledge we already have. The wall is that lurking realization that we really have no clue what we’re talking about. It’s that barrier to entry, that learning curve, where you need to truly understand the problem in order to succeed.

But once you’ve come to grasp the field, the really interesting work begins. You’re on a level playing field again; rather than learning the system, you can do work. And that’s where that high comes in, where you get lost for hours enjoying solving problems.


Every once in a while, rather than letting the wall drag me down, I let that fear invigorate me. It fills me, and rather than stopping me in my tracks, I take off. Outrun the wall, and you’ll outrun that fear, that drain too. One foot at a time, one brick at a time, and the wall starts to go away.

The Beginning

Sean Moore

In the real beginning, God decided to get out of bed and go to work.

Inertia.

It seems insurmountable sometimes. I’m reminded of getting back into the habit of running, after a hibernation due to the cold weather. There’s an aura of dread – I have to get up and exercise? – that’s inescapable. It’s a choking, cloying, smothering feeling of desperation. Wouldn’t it just be easier to stay inside, grab a donut, and watch people perform physical activity on TV? All too often, the procrastinator in me wins out.

But eventually I get the nerve to put on a pair of shoes and move my legs. Oh no, they most certainly don’t want to move. But eventually, my heart remembers that beating 120 times a minute is enjoyable. My feet remember that there was a time when they weren’t dragging on the pavement on the way to class. My lungs remember that fresh air exists. And my brain remembers the exhilaration of getting lost in the middle of a city, forgetting whatever stress may exist in school.

And more importantly, the next day is easier – and so is the day after that, and the next one, and so on. There’s something special about a start, a simple elegance of just proving to yourself that you can do it once.


It’s the same way when I’m starting a new project. I look at these programmer hotshots – the ones that can develop an app in the spare time of their day job (which, unsurprisingly, is a full-time programming job), that write code for ten hours straight without breaking a sweat, that get up and do it all over again the next day.

Here I am, barely writing a single line without heading to Stack Overflow or the documentation to remind myself how to call a function. Four hours, or less on a school day, and I’m brain dead and bleary-eyed. And the next day, I’ve got a programming hangover in the worst possible way.

The gulf between where I am and where I want to be feels insurmountable, and worse, it feels like it’s widening all the time. Another step forward, and it’s right off the cliff, never to return.


Okay okay – back off from the cliff.

Whether it’s running, programming, knitting, whatever, just remember: you and I aren’t born with these skills. And neither were those “masters”, those that we look up to. No one becomes great in a day. We just build upon a succession of tiny little victories, until one day, looking back, we wonder how we ever got to the other side of that cliff.

And make no mistake, it does get easier. Sometimes, it’s as simple as getting started. Other times, it’s getting started and failing 100 times. Slowly but surely, ten minutes without getting frustrated or giving up becomes fifteen, becomes thirty, becomes an hour, two hours, becomes ten.

What’s the saying? Something like 10,000 hours to master a skill. Sure; but it takes a hell of a lot less time to know you’ll make it those 10,000.

And more importantly, you can’t have 10,000 hours before you have one. So just start.

The Prototype Idea

Sean Moore

The prototype.

If you ever take a design class in college – and if you’re not an engineer, I have no idea why you would want to subject yourself to the high stress and the sleepless nights that come with it – you come to love the idea of the prototype.

You’re facing what seems like an insurmountable challenge: build a working, safe, effective thing. Every other class has been the same routine of learn some material, practice some material, test yourself on that material, and then forget about that material for good (finals notwithstanding, of course; typically, you’ll just have to cram it all in your head one last time).

Design classes are an entirely different thing. You learn something in class, and then you go work on a product that, frankly, you have no business building. You’ve never designed a working thing before; in fact, the closest you’ve ever come is building some Lego contraption out of parts that have fallen off the sets that came with those neatly laid-out instructions. You’ve never done product research, or human factors analysis. You’ve never used that research to design a custom housing. You’ve never built a circuit that didn’t come with a schematic. You’ve done none of this.

What did I say it felt like? Ah yes: insurmountable.

If you imagine the process of going from some cocktail napkin sketch to finished device as happening in one step, and then you ask yourself to do that one step, of course it is going to feel insurmountable.

But what you quickly discover is that the process is anything but single-step.

Which is why the prototype is so indispensable. Here you are with this grand idea, spectacular and nubile, on the one side, and on the other is your professor telling you an idea is worth shit, and how’s your device coming along, and you do realize that the product is due in two months, right?

You know where you are, and you know where you want to be, but you have no idea of the direction. Enter the prototype: make something, anything – make it ugly, make it do one thing poorly. See if it is closer than what you had before – but wait, you had a silly little piece of paper and some half-cooked-up dream, so of course it’s better. Is it closer to where you want to be? If so, great: make another one and make it do a little more, or make it a little smaller, or make it stop blowing up when you press that button! If not, well, make another one anyway, but don’t do that again (and maybe let other people know what you did wrong so others can avoid it in the future).

In this way, you barely realize the progress you’re making, but if you pick your head up at two in the morning after a month and a half, and look at the little graveyard of terrible designs you’ve discarded, you may just realize that you’ve got a half-decent product on your hands.

Now get back to work.


I bring up the prototype because there isn’t really that sort of thing in the abstract world. You have ideas on the one hand, these lofty concepts that are expressed in a word or two, and often carry a lot more meaning than they let on, at least to whoever thought them up. On the other hand are these concrete manifestations of ideas – a novel, or an article, or a manifesto, or a passive-aggressive letter pinned onto a church door.

There’s no middle ground. There’s no way to build something, anything, ugly but working in concept, and share it with the world, to see if you’re headed in the right direction, and what you can do to make your next one a little better.

A prototype idea. That sounds about right. Something that can fill the void between a tweet or a status update, and a blog post. Somewhere in the range of 100–250 words. Easily readable, easily understood, easily shared, and perhaps most importantly, easily changed based on feedback.

If it sounds like a silly idea, hopefully you can at least appreciate the irony that it took over 700 words to say it.

Human-Scale Social Networks

Sean Moore

Two’s a crowd. Three’s a company. And ten million is a nightmare.

I was introduced to Twitter three times. I parted with it three times, too.

The first time, I came to it far too early. Before I had any interest in social networks outside of where I sat in the cafeteria for my half-hour of freedom (high school, sadly – not prison). I was part of a quickly shrinking minority of people at my school who didn’t have a Facebook or a MySpace page; there just wasn’t any appeal to cataloguing my life, to subjecting my internet self to the silly dramas of high-school triviality.

Had my interests then been what they are now, I probably would’ve made a lot more sense of what was going on. But they weren’t, and it didn’t – I looked, didn’t understand what I saw, and I left, without giving any of it a try.

The second time, I was beginning to be much more internet-savvy – in college now, an engineer, finally exposed to programming that didn’t involve a calculator or a spreadsheet. A little more social-network aware, too – finally, said all of my friends, when I got a Facebook account a week before I left for university. Curious, I actually signed up this time, started following people. And I still didn’t get it. The brevity, the subject matter, just the idea of sharing tiny little snippets of your life still didn’t make sense to me. Less than a week, and I walked away again.

The third time was even more recent. An avid follower of technology now, I saw the service constantly coming up in RSS feeds, blog posts, tech sites. There was always a little bit of irony in reading, through my newsreader, some tech writer’s feed about how Twitter was making RSS ‘obsolete’. Twitter was the future now, and I was certainly showing up late. So I tried again, this time following the writers I followed as a reader, trying to use the service as they did. And still it didn’t make sense to me. It was just too much, too noisy, too big. So I mostly left, only checking in from time to time if I was looking for something interesting to read that I had missed elsewhere.


Perhaps my experience is dissimilar to most. Twitter never appealed to me as a place to have a conversation, because the people having them weren’t my friends, and the conversations I was having weren’t interesting. And no, I’m not the type who’s interested in sitting, mouth agape, watching uninteresting posts from celebrities flutter by, or trying to make the same lame joke as ten million other people about the most recent happening.

But there’s something bigger there, too, something beyond just that lack of acceptance. A much more relatable feeling: existing in something too big, and just not fitting in.


Capturing the world’s attention is admirable. If you want to make it in this kind of showbiz, you have to aim at least that high, or nobody, and certainly no one holding the purse strings that you’re batting your eyelashes at, will give a shit. Go big or go home, baby.

There’s a pretty serious stigma to being small. Small size means small growth. Small growth means small value. Small value means small-time money. Small-time money means small-time existence. The half-life for small on the Internet is getting increasingly… smaller.

Small has such a negative connotation, in fact, that I’d offer up an alternative name: human-scale. That’s a bit more faithful to the intent, really. Something that is approachable, understandable, and digestible by our brains without fancy analytics, Klout scores, or branding mark-up.

It’s true that a lot of things can’t exist at a human scale: you can’t swim in the same circles as the president, or your favorite Hollywood actor, nor can you cram your head full of what is happening in the world in 140-character bites. But a lot of things can only exist at this human scale. Conversations work great with one person, can accommodate a handful, and really start to break down when ten million people join in. Relationships are even harder to scale – they all but collapse when they get bigger than two.

There is something admirable in being small, something human-scale. Not to VCs and investors and stockbrokers, sure; but admirable to the people you build it for. Those people, the ones holding the conversations, forging the relationships, only exist at a human-scale. And really, we all do.


Here’s something that might’ve crossed your mind, savvy reader you:

*Why can’t we Internet-scale the human-scale?*

That’s an interesting thought, buried some 800 words in already, and probably lost to the goldfish attention span of ordinary web citizens, but I like to reward patience, occasionally.

Where was I? Ah yes, an intriguing concept. Take this massive, overloading, circuit-shorting monolithic omnipresence and put constraints on it, shrink it down. A network for your closest of friends, a little microcosm. If this sounds familiar, that’s because it is; it’s been done already, and it’s called Path – a good sign, if someone else was smart enough to found it, and some more someones were curious enough to fund it.

That sounds like the best of both worlds. The human-scale that people crave, the small, the space limiting enough to build up real communication, friendship, intimacy. The Internet-scale that investors crave, growth, growth, GROWTH, hockey-stick line chart and a fairy-tale IPO.

If it hasn’t clicked yet, I’ll make the contrast explicit: these are two diametrically opposed ideas. A design choice favoring one – no matter how small, how inconsequential – does so at detriment to the other. They’re compromised, they’re in conflict, and when it’s decision time, who do you think will win that argument?

The only way to get your human-scale network is to have a human-scale business. And who gives a shit about your human-scale problems when there’s Internet money to be made?


An App.net Epilogue

If there’s a happy ending to this long-winded story, App.net may be it, for me at least. The service is human-scale funded, by the people who use it. The conversations, the relationships, are human-scale, too, and there’s nothing that feels like celebrity-acolyte interaction. In a sense, everyone there has put their money where their mouth is: that human-scale can and must work.

I certainly won’t speak for Dalton, or Berg; it could be that their ambition is to make App.net Internet-scale. But right now, it feels human-scale. And until that changes, I won’t be leaving.

The Agora Phobia

Sean Moore

If a man speaks out in a room where everyone is shouting, does he make a sound?

Agora.

Funny little Greek word. The marketplace. But it meant much more than that to the Greeks. The agora was a place to come together as a community. To hear the news, the gossip about whose goat got into whose yard, whose olive bushes are looking best this year.

For the record, this isn’t first-hand information. I wasn’t there. But the Greeks were pretty good record-keepers. And I’ll certainly take their word for it when it comes to documenting their leisure time.

The point, if there is one, is that this meeting place was a place to share ideas, tell stories, and come together as a community. What started as a matter of practicality – centralizing commerce for the convenience of shoppers and purveyors alike – became an increasingly powerful tool in democratizing information. The latest news didn’t have to travel to every house in the city; it merely needed to be carried to the agora, where it would be dispersed.

It’s a law of entropy, of information, of matter in general: condense something into a smaller volume, and it will spread much more rapidly.


Twitter, by way of its CEO Dick Costolo, sees itself as the modern-day agora. The resemblance certainly goes beyond the superficiality of being the congregation place, the ‘water-cooler’, of their times. The Internet was a wealth of information, but the newest material took time to surface. Search engines needed to index new material. Writers had to produce that material – pick up stories, find sources, polish, publish. The top stories took time to grow, and few knew about the information until someone big picked it up.

Twitter changed that in a truly amazing way. No need to write a post, no need for the news to travel from site to site before being visible. Information only needed to travel as far as Twitter; and then, almost instantaneously, everyone – an everyone on a much grander scale than the agora could ever manage – was informed. Another centralization led to another democratization – suddenly your dentist could be nearly as well informed as your newspaper editor, could even be sharing the news with that very same editor.


Centralization comes with its own issues though. Remember the Agora’s primary purpose? Twitter has one too. As much as it would like to promote itself as a place of real-time discussion and information, Twitter exists to make money. And, coincidentally, the means by which it does so isn’t far off from the original agora – gather enough people together for merchants to make it worth their while.

Yeah. Adverts. Who would’ve guessed it for a free-to-use Internet service? But it goes beyond that. Because there’s more value than just paying for a mention, for a place in the feed, for an impression on the back of your or my eyeball. There’s value to be had in these companies just being there in the first place, always live, always talking. Another channel to shove a brand down your throat.

Now that agora, a discussion place for ideas, for stories, is an awful lot louder. There’s shouting left and right – sorry, what was that, it’s getting a little tough to hear in here, and would you look at that, a chance to win a vacation if only I retweet this post – it’s more crowded now, and that same group you used to talk with suddenly feels a little bit squeezed out.

There’s a counter of course – you have full control, “sponsored” tweets notwithstanding, over what you see and hear; the agora is for you to make.

But rewarding certain behaviors has a strange little way of influencing how systems work as a whole. And make no mistake, Twitter is rewarding this increasing cacophony wholeheartedly. A fever pitch of information numbs its users to the behavior. They accept it, they embrace it, and Twitter is all the better for it.


There’s another story of centralizing information that may be a more apt fit. In that one, people came together to share, to create something great out of what they were given, this common discourse. But that power turned out to be too great, and eventually those that came together were torn apart, and made to not understand one another. Without the commonality, what they created quickly fell apart. They grew louder and louder, unable to communicate.

They all just babbled along.

Other Attractors in the App Store Market

Sean Moore


Last week I highlighted some of the more interesting states that can exist for a product in the App Store market, and the pricing decisions that may arise due to those circumstances: the volume-driven, high-visibility state found in the top charts, and the more stable need-driven markets, where directed searches are the main input. I’d like to round out the discussion by considering where else points of attraction may lie in this space.

In a dynamic systems class, there’s typically some smart-ass kid (that’s typically me) who goes out of his way to point out the most uninteresting part of the system: the null case, where there is no dynamism, and there never was or will be. If you’ve ever taken a differential equations course, where they taught you about the foxes and rabbits, you’ll know this as the one where there never were any foxes or rabbits at all.
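If you never sat through that course, here’s a minimal sketch of the fox-and-rabbit (Lotka–Volterra) system – the parameter values are my own, purely illustrative – that makes the null case concrete: start with nothing, and nothing ever happens.

```python
# A minimal sketch of the fox-and-rabbit (Lotka-Volterra) system,
# integrated with a crude Euler step. All parameters are illustrative.
def simulate(rabbits, foxes, steps=1000, dt=0.01,
             growth=1.0, predation=0.5, efficiency=0.2, death=0.6):
    for _ in range(steps):
        dr = (growth * rabbits - predation * rabbits * foxes) * dt
        df = (efficiency * rabbits * foxes - death * foxes) * dt
        rabbits, foxes = rabbits + dr, foxes + df
    return rabbits, foxes

print(simulate(10, 2))  # populations cycle: the interesting dynamics
print(simulate(0, 0))   # the null case: (0.0, 0.0), forever and always
```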

Leave it to me then, smart-ass that I am, to point out that this null case fully exists in the App Store market as well: it’s entirely possible, and maybe even extremely likely, that an app will find itself with few sales at a low price. There are certainly a whole number of reasons why an app would end up in this graveyard; few of them are interesting, and fewer still are worthwhile to discuss. What is interesting, however, is determining whether this is a final resting place for an app, and if not, what exactly can be done to escape.

The other interesting point is a combination of the two discussed previously: does there exist a place in the market where apps can command a high price and also drive volume purchases from exposure in the charts? Certainly, the marquee apps in the store come to mind: Apple’s own iWork and iLife suites, to be sure, but others as well.

What’s particularly interesting, beyond the literal piles of money that these lucky few undoubtedly make, is what properties this point inherits from the low- and high-volume positions.

Allow me to get a little technical for a moment: suppose this point consists of a simple addition of the stable, predictable sales of the search-driven state and the more volatile popularity boosts of the rankings. What would we expect to see? Something engineers commonly do to analyze black-box systems is to characterize their frequency response. If our assumption indeed holds, the analysis would show a good deal of power at low frequencies, corresponding to those stable purchasing behaviors, with the remainder spread among higher frequencies, due to the volatility in purchasing from the top-charts rankings.

Even if our assumptions did not hold true, performing this type of analysis would shed some light on the behaviors of the customers buying the app – whether their purchases are inherently cyclic (perhaps right after a paycheck), or some more erratic behavior is at hand. What you give up in temporal identity – the ability to correlate changes in purchasing with specific events – you gain in understanding of seasonality.
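To make that concrete, here’s a sketch of the analysis run on simulated daily sales – every number below is made up for illustration, not real App Store data:

```python
# A sketch of the frequency-response idea on simulated daily sales:
# a steady search-driven baseline, a biweekly paycheck cycle, and
# erratic chart-driven spikes. All numbers are illustrative.
import numpy as np

rng = np.random.default_rng(0)
days = np.arange(365)
baseline = 50                                   # steady, search-driven sales
paycheck = 25 * np.sin(2 * np.pi * days / 14)   # biweekly purchasing cycle
spikes = rng.poisson(0.03, 365) * rng.integers(100, 400, 365)  # chart bursts
sales = baseline + paycheck + spikes

power = np.abs(np.fft.rfft(sales - sales.mean())) ** 2
freqs = np.fft.rfftfreq(365, d=1.0)             # cycles per day

# The dominant peak should land near 1/14 cycles per day – the paycheck
# cycle – while chart-driven volatility smears across higher frequencies.
print(freqs[np.argmax(power)])
```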

Certainly worthwhile, if you intend to make a living out of your sales.

The Self-Obsession Network

Sean Moore

Why yes, I’d like to read a live-blog of your life.

Why do you post something on Facebook? Is it to proclaim to the world that you accomplished something? Do you just enjoy throwing the intimate details of your life into the winds, unconcerned with their final destination? Perhaps you like to chronicle every banality in your life for later reflection?

Or, and let’s be honest, this isn’t really up for debate: maybe you’re just an asshole that likes to brag.

What a wonderful thing that Facebook already includes metrics to measure just how awesome you are. Three shares, fifteen likes, and ten comments? You must be a social media mastermind. Tell me, do you sit there, after you’ve posted your particularly clever thought, waiting for the rest of the world to recognize your genius?

Do you get upset when the world doesn’t recognize your genius? No, you’re right, it’s not you – maybe your joke went right over the heads of your friends. You can’t be blamed for your less-sophisticated friendships, after all. Or maybe someone else beat you to posting that link to that Tumblr page of that GIF of that one funny actor and a comment on someone’s weekend adventure.

Or maybe you’re just not that interesting. Maybe, just like you, all of your “friends” just blindly accept their spammed friend requests, and they couldn’t even pick your face out of a lineup (and you, silly you, have done an extreme disservice to every potential creeper, making your profile picture one of you and your twenty ‘besties’). Or what you have to say just isn’t interesting, isn’t funny, just blends in to the rest of the drivel in everyone’s newsfeed.

Sorry about that.

Or maybe instead your friends aren’t even on Facebook, eagerly awaiting your next golden nugget. They’re out living their lives. You know, the ones that don’t involve sticking your nose into a glowing rectangle.


No one logs onto Facebook to congratulate a friend on completing some incredible journey. That’s not something a status comment, an email, a tweet, a like, anything social can capture. The important things in our life still require the physicality of human interaction, rather than the performance our digital social lives have become.

So instead, they visit to proclaim some unessential fact. They visit to hear what someone is saying about the clever little dick joke they posted. They visit to be heard. They visit to be reminded that they haven’t been forgotten.

It’s not the right kind of social. We – and I mean that in the most intimate sense of you and me – doesn’t depend on constant validation. We depends on hard work. We depends on trust. And most importantly, we depends on love.

Facebook, Twitter, and whatever else exists currently competes for our attention span. They certainly compete for our eyeballs. They may even grab ahold of that little voice at the back of our head.

But they don’t lay claim to our hearts.

Diminishing Returns

Sean Moore



Nobody listens any more. I can’t talk to the walls because they’re yelling at me. I can’t talk to my wife; she listens to the walls.


What benefit do you receive when a new user signs up for Facebook, or Twitter, or any of these social networking platforms? What benefit, in fact, do you receive when you make a new friend, or a new follower?

Let’s enumerate the obvious, lest someone later call me deliberately obscure. To be sure, every new add to the network is an additional revenue source – another eyeball to plaster with advertisements, promotions, sponsored content, and bullshit social graph promotion (why yes, I would like to know how many friends ‘liked’ KFC). More revenue, more growth, means the shooting star of a start-up can exist longer, can afford a new feature, can spend time and attention making what you use lovelier.

And make no mistake, there is a certain kind of benefit when you add a person to your own personal social graph. Another set of life’s snippets, momentarily plastering the backs of your eyeballs before being lost in the avalanche of the news feed. Another person who will guffaw at your dick jokes. Or an ever increasing supply of the boring lunches of boring people, dressed in a slap-dash filter. There’s value to content, to be sure.

Hooray! Every new user means your social network can continue to supply you with a set of features to keep track of the exciting activities of all your friends.

Long live the king.


Is there a true value to adding another user, or another friend? Is there any practical difference between your graph having ten friends, or a hundred, or a thousand?

A social network – and truly, any sort of network – should be judged by how much value is imparted to the current users of the network when an additional user is added. A simple metric, really. Do additional users add any meaningful utility to the current system?

There are obviously diminishing returns here. Adding to a network of one is an extremely valuable operation – it’s what makes language and writing so inherently valuable. It’s unfair in many ways to compare that to adding another person to a billion-strong network.

What’s concerning though is that these additional users in these high-capacity networks are diminishing the value for existing users. Maybe they use the product differently than the people that were already there – if that usage is happening at a large scale, it could spell doom for the current users. Goodbye, thing you love. Hello, thing you love to rant about.

Remember the Mayflower coming to America? It’s like that, but with no Thanksgiving.


There’s no set-in-stone rule that these network-effect products have to produce devaluation when such a large scale is reached. It’s merely a consequence of poor product design. Networks need to be designed to constantly add value when new users, and whatever they happen to bring with them, are added. The product’s long-term success depends on it.

Now Wait for Last Year

Sean Moore


Live in the future, and build what’s missing.


Does this feel like the future?

Perhaps I should clarify. We are living in the future. Not our future, of course – don’t be silly. In fact, we aren’t even living in our present, stuck as we are in the imperceptible time lag of sensation and perception that allows us to experience reality.

Thanks for nothing, Mother Nature.

But we are living in the future – the future of our past selves. This time, right now, is the future we daydreamed about, we hoped for, we anticipated. And it’s the future we built.

So let me ask again: does this feel like the future?


I remember when I first got an iPod touch, all the way back in 2007. My computing experience before that had been almost exclusively tied to a desk. And there I was, walking around during winter break with a fucking handheld computer that could do anything a teenaged boy could ever want to do.

Holy shit! I can write a note in Marker Felt!

Look, look, look! My email! In my hand!

But what was most amazing, and what still is most amazing, five years later and even with a million apps a tap away, was having the entire Internet at my fingertips.

That was the future. That is the future.


Does this feel like the future?

Because for all their holy-fucking-shit-ness, that touchscreen handheld communicator, computer, and personal assistant is old hat to us present-day time travelers. And what’s come up in its place? What is everyone clamoring for? The social experience – the future that you, me, and everybody are building, the future we’re careening toward, in reverse.

The great promise of the social future! Share every aspect of your life, with your closest thousand friends! Display what inspires you, with your closest thousand friends! Instantly communicate your innermost thoughts, with your closest thousand friends!

What temptation, what allure – the social revolution is full of promise. Recreating and archiving every aspect of your digital self. Collect every thought, every inspiration so you never lose it. Instantaneous communication across the world with millions of people. The future is already here, and all we have to do is live in it.


And we do live in it. And it leaves much to be desired.

Does a phone book feel like the future? What about a high school yearbook?

Or a scrapbook? Or more accurately, a city dump?

What about a sleazy advertiser with a megaphone? Or a thousand solicitors on the corner, pestering you with their flyers?

Facebook. Pinterest. Twitter. This is what the darlings of the social revolution have to offer.

Does this feel like the future?

Because it looks like the past. Such a shame; the future didn’t have to end this way.

App Store Economics – Low-Volume App Market Economics

Sean Moore

I talked previously about optimizing revenues in a top-charts, high-volume App Store market setting, and the inherent pressures of competition, visibility, and consumer demand. I also discussed how these pressures cause app suppliers to receive the greatest benefit from lowering prices to induce volume purchases.

If you recall, the discussion stemmed from David Smith’s two (admittedly anecdotal) hypotheses:


  1. Initially higher pricing signals an inherent value to the market.

  2. If your app is in the top rankings, lowering the price serves to prolong ranked status and increases overall revenue.

I’d like to discuss the former claim now, and determine if higher prices do in fact signal inherent value. What about situations where an app isn’t on the top charts, where demand is relatively stable over time, or perhaps the app fills a very specific need?

I’d like to postulate that apps in this situation exist in a relatively price-insensitive market. Why? These apps aren’t being discovered through random browsing or scanning the top charts. These apps aren’t being purchased on a whim; instead, they are being selectively sought out. The customer is looking for a specific need to be filled, a job to be done.

In cases like this, price serves as a very different signaling mechanism. Whereas in the earlier discussion a low price prompted impulsive purchasing – not dissimilar to advertising a sale on candy at the checkout lane of a grocery – here, price may command a certain valuation from the consumer. A low-priced app trying to fill a specific need may be seen as having some sort of deficit that forces it to command a lower market price.

More importantly, though, purchasers who have found the app’s listing by searching for a specific need are likely predisposed to compare apps by the feature set they advertise, rather than the sticker price. If an app fails to meet a specific need, the price will hardly matter to the consumer.

In either case, whether by signaling or by capturing the inherent value users place on an app designed to meet a specific need, developers of apps with stable purchase rates may in fact increase their net revenues through a price increase. I think the line is less clear than in the case of high-volume purchasing, though. Too high a price commanded by the app may put it out of the range of reason for its customers.
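To put rough numbers on the idea – and these are my own illustrative assumptions, not App Store data – here’s what a price-insensitive market looks like under a simple constant-elasticity demand model:

```python
# A back-of-the-envelope sketch of a price-insensitive market, assuming
# constant-elasticity demand Q = k * P**e with |e| < 1. The model and
# every number here are illustrative assumptions, not App Store data.
def revenue(price, k=1000, e=-0.4):
    units = k * price ** e       # inelastic: demand barely moves with price
    return units * price

for price in (0.99, 1.99, 4.99, 9.99):
    print(f"${price:.2f} -> ${revenue(price):,.0f} per period")

# Each price increase sheds so few buyers that net revenue still rises –
# at least until the price leaves the customers' "range of reason".
```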

There are still many questions to be answered regarding the market, though. These last two posts have illuminated the more predictable reaches of the App Store market, and the steady-state behavior of each. But that says nothing about how apps arrive at that steady-state behavior.

In dynamic systems, we call this sort of analysis a “phase plane” plot. Attractors, such as these high- and low-volume states, define equilibria and steady-state behavior. But often what makes a system interesting is how these points are reached, and more practically, what outside agents with an interest in reaching said attractors – such as the developers that make a living off the apps – can do to perturb the system one way or another.

And so there is much left to discuss. Do more attractor states exist within the market? What kind of stability do these states exhibit? Where do the boundaries between these points lie, and what influences their magnitude and direction? With more analysis, some or all of these questions can be answered.

App Store Economics – High-Volume Pricing

Sean Moore

The iOS App Store is an infuriating system for the geeks that are largely responsible for developing the software found on it. Data drives this curious subspecies – number of hits, funnel conversion rates, load times, download sizes. This information obsession is what makes apps on the store so delightful; every detail is accounted for.

Which is exactly why the App Store proper – and more specifically the sales data Apple provides to developers – is such a hair-pulling, frustrating system. Developers get almost no information; little more than the system’s output. That is to say, the paycheck Apple cuts.

As a software developer, there’s nothing more infuriating than a lack of information. But as a bioengineer, this inability to see into the inner workings of a system is a normal thing. It’s our job to understand how systems work without having the luxury of prying them open to see their inner machinations.

In that spirit, I’d like to start a series of posts to try to understand how the App Store marketplace functions, and how developers can make the most out of the system.

I can’t claim that this is real hard science, but at the same time it’s no voodoo either. I’ll be relying on the observations of the talented and wonderful developers who have been thoughtful enough to share their experiences.

What’s the observed phenomenon? David Smith, in his *Developing Perspective* podcast, put forth two (admittedly anecdotal) hypotheses:


  1. Initially higher pricing signals an inherent value to the market.

  2. If your app is in the top rankings, lowering the price serves to prolong ranked status and increases overall revenue.

What can be gleaned from these two observations? Can we determine how the market acts based on these phenomena?

Let’s tackle these observations in reverse. First, some basic understanding of supply-side economics for an iOS developer. Let’s consider a native application with no developer-supplied web assets; essentially, just the “app”, avoiding server costs that depend on the number of users and would needlessly complicate this already simple-minded analysis.

We should start off by asking: what exactly does the supply curve of a software developer look like? In the iOS App Store market, software is a rather unique offering; the marginal cost of another copy is effectively zero, so supply is essentially unlimited. Developers can set a price and meet whatever demand is placed upon their product because their product is truly digital. This gives developers complete price control, allowing them to optimize for revenue.

The question then becomes: how do we maximize revenue? Forgive the simple-mindedness, but just to review our ninth-grade economics, revenue is a pretty simple equation: total number of units sold × average price per unit. It’s a two-factor optimization – if you can double units sold by decreasing the price of your product by less than half, then you are increasing overall revenue.

Let’s go back to Smith’s observation: for a top-ranked app, lowering prices can increase overall revenue. What does this say about the App Store, if this observation holds true? If small changes in app pricing create a correspondingly larger number of downloads, then the market, at least under these high-volume conditions, must be price sensitive. Economists like to call this price-elastic demand. Which is just a fancy term for saying price holds high sway in the market.
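Here’s that arithmetic in miniature, with made-up numbers (mine, not Smith’s):

```python
# A sketch of the price-elastic case with made-up numbers, not Smith's
# data: revenue before and after a price cut, plus the implied elasticity.
def arc_elasticity(p1, q1, p2, q2):
    """Midpoint formula: % change in quantity over % change in price."""
    dq = (q2 - q1) / ((q1 + q2) / 2)
    dp = (p2 - p1) / ((p1 + p2) / 2)
    return dq / dp

# Hypothetical top-chart app: a cut from $2.99 to $1.99 more than
# doubles daily downloads – less than half the price lost, in return.
p1, q1 = 2.99, 1000
p2, q2 = 1.99, 2400

print(f"revenue before: ${p1 * q1:,.0f}/day")   # $2,990
print(f"revenue after:  ${p2 * q2:,.0f}/day")   # $4,776
print(f"elasticity: {arc_elasticity(p1, q1, p2, q2):.1f}")  # about -2.1
```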

The final result for developers is a little nonsensical, though. If your app is doing well, well enough to make the incredibly competitive top charts, the least sensible-sounding thing would be to lower prices. And yet, given these observations, that seems to be the best way to make a few more dollars – and a few more customers as well.

This would be as good a time as any to mention that I am no expert, and you take my advice at your own risk. But, if that risk pays off, feel free to send a beer or two my way…

Yes, and; No, but

Sean Moore

No, No, No.

It’s easy to get hung up on that lone two-letter word and start ascribing special meaning to it. It’s easy to get trapped into thinking that by the mere act of saying “No” you drive yourself toward greater success. It’s easy to say we can’t make this work, let’s try something else. It’s easy to say no, let’s not do this because it can’t be done.

In no small part, it is about saying no, but more importantly, it’s about saying no to the things that don’t matter. No isn’t some mystical power; it’s merely a way to avoid compromising on what you believe in.


Yes, and…

That’s the first thing you learn in improv. Well, it’s the first thing you learn in HR-sponsored improv team-building workshops. It’s all about accepting the relationship, and then building on it. Just what HR wants – smiles and sunshine.

And sometimes, avoiding compromises means saying yes, too. Because without a yes or two, we don’t create amazing things, we don’t identify what matters most to us, we don’t understand what we can’t do without.


No, but…

That’s my take on this relationship. You’ve got to be the champion of your time and attention. Saying yes to everything you’re offered means saying goodbye to anything you hold important. Do you want to be beholden to the future of someone else’s choosing?

But there’s no reason you can’t continue on after that first little word. Gather information, set up a later date to learn more. Think, then act. Maybe what’s being offered turns out to be something that delights you.


Say no to boring things. Say yes to impossible ones. Never stop asking yourself which is which. Always keep what matters most close to your heart.

Well, Grammar

Sean Moore



“Would a rose by any other name not smell as sweet? That is, assuming you’ve remembered the semicolon.”



Would you mind telling us your level of experience with programming languages?

What a loaded question.

It’s an HR rep, of course, who’s asking you. Just an orifice to some half-baked web form. A translator, a temporary measure, until our machines no longer require typing as an input to chew the cud on a line of best fit between your skills and some dead-end job.

Would you mind describing that on a scale of one to ten, please?

How do you rate a skill that you’ve spent the past five years using on a daily basis? For that matter, how do you rate a skill that you’ve picked up by studying some books and watching some YouTube videos over a weekend or two?

The knowledge that the skills we learn, that we use, that we teach ourselves, don’t readily map to a neat and tidy scale won’t set the world on fire. It’s the product of our quantified world.

And yet, the people hiring these knowledge workers treat these skills – and for what it’s worth this isn’t solely about programming (though it is where I’ve seen some of the more heinous acts) – like vessels to be filled. Or a balance to be weighed against. Or some sort of standardized assessment.

As if.

Like a lot of other, though admittedly more established, professions, knowledge workers, and programmers specifically, are craftsmen and women. A writer is not judged on how she constructs a singular sentence; a woodworker is not measured by her sanding technique. In every case, it’s all about the end product.


Most misunderstandings come about from errors in attribution, not comprehension. And it’s exactly the case here. Because programming languages, as we call them, on their own aren’t languages. They’re grammars.

“Language” implies this vast remapping of the mind; the ability to understand a language, in the more colloquial sense of the word, is a cognitive undertaking. Learning a new programming language amounts to little more than reading a sentence written by a different author. Some are like Hemingway, and are terse; others prattle on. Like your dear narrator.

In the first programming class I took, my professor remarked that he knew twelve programming languages. I, the neophyte, listened on, mouth agape. Nowadays, I’m not far behind (and yes, that is the humblest of humble brags).

The surrounding culture of the programming language – the frameworks, the libraries, the APIs – this is what constitutes the act of learning the language. It’s one thing knowing how to conjugate Latin verbs correctly. It’s another to give a sermon in Vatican City.

Again, it’s all about the final product. It’s one thing to be a syntactically correct programmer. It’s another to make something people want to use.

Another great craftsman, Kurt Vonnegut, once said of another great stylist, “You should realize, too, that no one would care how well or badly Mr. [E.B.] White expressed himself if he did not have perfectly enchanting things to say.”

Make sure what you make, and what you love, is perfectly enchanting. The rest – rankings of skill or not – will follow.

Selection Pressure

Sean Moore


Some are born great, some achieve greatness, and some have greatness thrust upon them.


There’s a story worth telling. More of an anecdote, really, that one might overhear in the cell bio racket. It goes a little something like this: a doctor has a patient with HIV; they’ve been working to keep the white cell count up, the infection at bay. It’s doable, but there’s a certain inevitability. To stay on top of the virus, you have to stay one step ahead; every drug starts off mostly effective, but that effect inevitably diminishes over time. Escalation is a necessity, but with every new drug, a new resistance emerges. The body’s a war zone, and the only survivors are born killers.

It’s been a few years of treatment now, and the doctor is at the end of his rope. There are no more next-gen treatments; all the innovation has been used up – no next big thing, no new weapon. It appears that there is no hope. So instead the doctor does something unexpected. He takes his patient off all treatments. Cold turkey. Cessation of hostilities.

What a fucking joke, it seems. Giving a deadly virus free rein to run rampant through the body? May as well throw in the towel entirely.

But an amazing thing happens during peacetime. The qualities that serve the virus well when drugs are pulsing through the patient’s veins are poor qualities for thriving once the bullets have stopped flying. The game has changed, and survival instincts are traded in for traits that make the most of flourishing in the unfortunate host. And with that, hope emerges: the vulnerability to drugs can be exploited yet again.


We are all shaped by the environments we inhabit. In hostile environments we become acerbic, filled with venom; or nimble and alert, the mongoose in the cobra pit. In the pressure-cooker, stress-filled time bomb that college can frequently be, we become neurotic, heads-down automatons, unable to look past the next due date for a moment to plan ahead. It’s all defuse, defuse, defuse.

It’s no surprise that people in menial jobs are so often menial people; it’s no surprise that DMV employees are some of the worst humans in the world. Survival is compulsory – wretched environments overwrite ordinary people in remarkable and horrible ways.

But it can work in our favor as well. If you want to forge yourself into a better person, it can start by removing every anchor keeping you from reaching the heights you aspire to. Getting in shape doesn’t start with buying a pair of running shoes and an exercise book; it starts with getting rid of that 70" flatscreen in your living room and the sour cream and chive potato chips in your cupboard. Yes, the Nutty Bars too.

There’s no sense sitting alone, off in your world, pining about how you wish you were something, or someone. You are someone; there’s no reason you can’t be that someone else. Being that amazing person you want to be starts by doing something incredibly mundane. But that’s ninety percent of the battle – the start. Get out of the war zone, and you may be surprised by what qualities you never knew you had start to thrive.

Fortune favors the prepared.

Gordian Knot

Sean Moore


I was spending some time with a good friend of mine not too long ago. A perfectionist, you’d call her. The kind that will stay awake until three in the morning making sure a layout is pixel-perfect. The kind that corrects the grammar, spelling, and punctuation of any and all of your text messages, in the kindest, most honest, and definitely not annoying kind of way.

Yeah, that kind of perfectionist.

She was showing off her new hobby, a hobby her friend had taught her recently. Knitting, she called it; though it appeared, to the untrained eye, to be more like knotting yarn together, looking over it with a fine-toothed comb, and then hastily undoing the last ten minutes of needlework. Over the course of our time together, what had started as a two-inch prelude to a beautifully knit scarf, ended… as roughly a two-inch prelude to a scarf.

Knit and unknit. Knit and unknit.

I almost felt bad, being there as a distraction from her work. We’d get to talking for a while, her hands would keep moving, needles following that rhythmic pattern. Under, over. Under, over. In no time, there was quite a length.

But in that length was something insidious. Once or twice – and mind you, none of this was apparent to the untrained eye – a stitch was ever so slightly out of place. And so my friend would make all this progress, all the while unknowingly sowing the seeds of her failure. Inevitably, our conversation would pause, and she would look down and see her mistakes. And every stitch would be undone, to start again anew.

There’s a problem, you see. Perfectionists see every flaw.


I’m not much of a knitter, mind you. But I too, am a perfectionist. For six months that meticulous, nervous energy went into making this site Just Right. Perfecting the layout. Setting up the sections, the about pages, the extraneous material just so. Changing a hue here, the saturation there, adjusting brightness ever so slightly. This font doesn’t look great, or that heading doesn’t work well.

And the writing itself – what a deliberation! Writing in just the right voice. Working as hard as I could to avoid the first person (and the parenthetical, too – look how that turned out).

Perfection can consume you. Perfection consumed me, for the past six months. In all that time, the one thing I didn’t do was the one thing, the only thing, this site even existed for: to write. Instead of enabling me to write every day, all this overhead made me avoid the one thing I really wanted to do.


“This is supposed to be relaxing.”

That was her answer to my question of why she started knitting in the first place. There was more than a little bit of humor in her response. Because she knew, and I could clearly see, that the one thing she could not possibly have done with that ball of yarn and those needles was relax. It wasn’t enough for it to just be something she did with her hands to unwind. It had to be perfect.

I tried to convince her that the only way to knit that perfect scarf was to make some mistakes. She agreed completely.

And then proceeded to undo her work for the fourth time that night.


We wish we could be perfect the first time. Especially in this age of having the best of anything, everything, and everyone, screenfuls at a time. But we have to accept that perfection needs practice. The work you do now is a down payment on future perfection. Blemishes, mistakes – or uneven stitching and incorrect font sizes, as the case may be – are a necessary part of the process of becoming great.

It’s hard to accept that blemishes will happen. That they’re necessary, even. Learning occurs through making and correcting mistakes. The trouble is, where do these mistakes go now?

With the internet, that’s a tougher question. Not too long ago, when a writer wanted to make a name for him or herself, there were layers to hide the stories, the articles, the novels that didn’t make the cut. Publishers provided the bar: you must be this good to be viewable by the public.

But this is the Internet. Always available, always viewable. Where do mistakes go here? Into the archives, for someone to come across at some indeterminate point in the future.

I’m a self-conscious guy. The thought of this trailing anchor of sub-par work horrified me.

Maybe the only way to win is not to play.


Later, she put her needles away. And for the first time, she relaxed. She was no further on her project than when she started, really. I think by then we both had realized that this spinning of the wheels was no good to anyone. All that effort, just to end with nothing to show for it.

Nothing finished. Nothing shipped.

And as I left, she told me that maybe this time next year, she’d actually have something, anything, to show for her work. At first, I thought she was being facetious, given what I had seen her accomplish that night.

Later though, I realized that she said it in earnest. It was a promise to me, and more importantly, to herself, that she wouldn’t let her strange “tic” of needing everything to look exactly right get in the way of making something beautiful.


All this was a very long-winded welcome to my brand new site, Belligerent Mars. I’ve set down my needles, so to speak, and stepped away from my creation, blemishes and all, recognizing that no matter how much I wish it were otherwise, success and failure lie on a continuum. And we have to start somewhere.

This is that start.