This Is How You Inspire Team Loyalty

Here’s a simple tip for managers who are looking to inspire their teams to perform their best, to bring their most creative ideas and offer the best solutions:

Don’t take credit when presenting those ideas or solutions to clients or others.

It’s that simple.

Early in my time as a manager, I realized that using “we” instead of “I” was much fairer to the people on my team.

So the email to the client following our brainstorming became “We thought…” instead of “I thought…” That way the credit was spread around to everyone who had participated, and the client saw I wasn’t the be-all and end-all, the only person they could ever turn to if they had issues to be addressed or questions to be answered.

The exception to that rule is, of course, when something goes badly. In those instances the responsibility is all mine.

Ultimately that’s true. We might all have contributed to the idea, but as the manager I was the one who greenlit it as ready to present to the client. If anyone was going to fall on a sword, it was going to be me.

When things are good, you as a manager should step to the side and draw everyone’s attention to those behind you. When things are bad, you as a manager should step in front of those behind you and refuse to move, taking the brunt of whatever’s coming yourself.

Those are the kinds of behaviors that inspire loyalty, not just in this job but years down the road.

Google Rebrands Feed, Retains All Its Problems

Last year Google introduced Feed, a personalized feed of news and updates appearing below the main search bar within the Google mobile app. I took issue with that at the time, specifically calling out how it was another example of a company feeling it knew better than the individual what someone might be interested in or *should* see. People could mark stories they did or didn’t feel were actually relevant, but that’s about it.

Since that was added to the app I don’t think I’ve used it at all, and I suspect that pulling in that feed of updates and stories is why the app takes longer than I would expect to load to the point where I can actually use the search functionality. It’s made what should be a lightweight experience optimized for search into a very heavy one that keeps trying to tell me I should be doing more.

Feed has now been rebranded as Discover, because all mobile news/update apps are now required to have a section with that title. A bit of new functionality has been added as well, mostly to better collect information on user preferences and guide their search experience. Still there, though, is the presumption that what someone does online, specifically within the app, is the best indication of what they’re interested in.

You can guess where I’m going with this.

I continue to not fully understand why these sorts of algorithmic feeds of updates and news are a better user experience than RSS feeds.

Part of the pitch behind this and every other feed like it on Facebook, Instagram, Twitter and similar apps is that users can train the AI that powers what’s displayed by voting stories up or down. Don’t want to see that story about politics? Just say so. Want to see more about music? You can tell the app and it will learn.

The great part of RSS is that it’s the exact same amount of work on the part of the reader, but without actively sending information to a tech company, information that will likely be used in the near future to show you better-targeted ads. You opt in to the feeds of the sites, blogs or keyword news searches you want to read, and when something is no longer relevant to you, you unsubscribe from it.

The problem, as I’ve pointed out in the past, is that RSS is a “dumb” technology. In point of fact it was designed to be just that so it could work across browsers, platforms and so on. It just works, without a lot of corporate interference. That means it’s tough to monetize, though.
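To make the contrast concrete, here is a minimal sketch of what that opt-in model amounts to in practice. It assumes Python’s standard library, placeholder feed URLs and plain RSS 2.0 feeds; it’s an illustration of the format’s simplicity, not anyone’s actual product. The reader’s entire “profile” is a list of addresses they chose, and unsubscribing means deleting a line.

```python
# Illustrative sketch only: a reader's "subscriptions" are just a list of
# URLs they picked. The URLs below are placeholders, not real feeds.
import urllib.request
import xml.etree.ElementTree as ET

FEEDS = [
    "https://example.com/feed.xml",
    "https://example.org/rss",
]

for url in FEEDS:
    # Fetch the raw XML; no account, no tracking profile, no algorithm.
    with urllib.request.urlopen(url) as resp:
        root = ET.fromstring(resp.read())
    # RSS 2.0 nests items under <channel>; grab each headline and link.
    for item in root.findall("./channel/item"):
        title = item.findtext("title", default="(no title)")
        link = item.findtext("link", default="")
        print(f"{title}\n  {link}")
```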

Which is surprising since there’s a massive amount of information about me that could be found in or derived from my RSS usage. The sites I subscribe to, the kinds of stories I save for reading later, the ones that generate enough interest to click through to view immediately…it’s all there. But it’s not proprietary, so no one wants to use it.

Discover is Google’s latest attempt to do anything but support the RSS format. Fast Flip, Currents and other now-shuttered apps have all tried in various ways to display the news. And of course there’s the late, lamented Google Reader and the company’s continued sidelining of Feedburner.

The problem is that RSS puts the power in the hands of the reader, not the company, and that’s something the tech industry doesn’t understand and seemingly won’t tolerate.

More on the Dignity of Work

Last week I broke one of my cardinal rules: Never try to have a nuanced argument with someone on Twitter.

The inciting incident was a story about Jane Fonda and the comments she made about male abusers who are attempting comebacks after lying low for a while, having received only minor punishments after being accused of sexual harassment or assault. Some are hoping that a year out of the spotlight, during which they’ve done nothing to publicly improve themselves or make reparations, is enough and they can now reclaim their privilege.

To that, Fonda says they should “Sweep the floor at Starbucks until you learn.”

OK. I get what she’s trying to say, and I understand that for someone who was, for instance, the co-host of a nationally broadcast morning news/entertainment program, sweeping the floor at Starbucks might seem like the most degrading job there is.

The problem with Fonda’s comments is that by saying *some* people should be relegated to that kind of job as a consequence of their misdeeds, it necessarily follows that anyone in that job should feel it’s a form of public shaming, like convicts picking up trash on the side of the road. It’s not fair to anyone who chooses that position because it works for them or who uses it as a lifeline because other positions aren’t available to them.

That same kind of attitude, that a retail job is something to be escaped, is behind the media framing of continued stories about actor Geoffrey Owens, whose appearance bagging groceries at Trader Joe’s set off this whole conversation. Many headlines have labeled this as a “second chance” for Owens, as if he’s being lifted up out of the hell he’s been confined to since falling out of the public eye.

I’ll take the opportunity afforded by these two stories to make a few additional points not covered in my first post.

First, there were times I did actually feel acute embarrassment while working at Starbucks, especially in the first few months. Those moments of almost physical cringing usually came when someone I knew came into the store.

But what could I do? When they asked what I was doing there I explained that, hey, I’d been laid off recently and hadn’t found other full-time work, so I was doing what I could in the moment. If they had a problem, that was on them. What I was doing was the embodiment of the Judeo-Christian work ethic we supposedly all subscribe to, and the alternative was “nothing.” My embarrassment didn’t last long as I became more and more comfortable with my circumstances and the fact that I was working hard at the opportunities given to me.

Along those same lines, I’m curious what hiring managers would make of a situation like my own. Would two years of “Starbucks barista” on my resume or LinkedIn experience list read better than “unemployed” because it shows I’m a self-starter who’s willing to roll up his sleeves and do what’s needed? I haven’t had many opportunities to ask or find out, so I’m interested in what someone might know along these lines.

Finally, if you want to know what real shame and embarrassment feel like, file for unemployment. There’s not a single part of that process that isn’t humiliating and degrading, from having to briefly explain why you need it to agreeing (at least here in Illinois) to sign up for state-generated job opening alerts, many of which are useless. Plus you have to regularly demonstrate – with specific examples – that you are out there looking for work.

Always remember: These jobs aren’t shameful, but many of the circumstances and requirements society and government have built up around them are. You can’t claim to value hard work but then overlook those in jobs that demonstrate it.

Education For Me, Not For Thee

Acting Consumer Financial Protection Bureau director Mick Mulvaney is very concerned that people are taking out student loans to pay for college without considering the “moral” obligation to pay those loans back.

Mulvaney’s comments, best heard in your head in the voice of an 18th-century French nobleman decrying the work ethic of the common man, seem to be rooted in a couple of elitist, out-of-touch ideas.

First, that those who can’t afford college costs out-of-pocket are inherently scammers and criminals whose primary aim is to take from the system without contributing to it at all. It’s the equivalent of Ronald Reagan’s famous – and fictional – “welfare queen,” a criminal out to steal from the well-meaning taxpayer and an example of how entitlement programs encourage grifting.

Second, that everyone who goes to college is automatically handed a $150,000/year job at the same time as their degree.

He couldn’t be more wrong on both counts.

His claims came at about the same time a new study was released showing that the fastest-growing demographic group now living in poverty, as it’s defined in the US, is individuals with at least a bachelor’s degree. In 2013, 4.4% of those with a college degree lived at or below the poverty line; by 2017 that had grown to 4.8%. That growth happened at the same time poverty rates among other groups declined.

They also come at the same time we’re marking the 10-year anniversary of the 2008 collapse of the US – and global – financial market. The fact that it wasn’t worse, though the middle class has never really recovered, is largely due to bailouts for the rich whose effectiveness is still up for debate.

The idea that paying back student loans is a “moral” obligation overlooks the questionable moral position that debt itself, as a concept, holds. That’s doubly true for education debt, which is accrued in the quest someone undertakes to better themselves and their position. Student loan debt generally can’t be wiped out in bankruptcy, unlike nearly every other kind of consumer debt. Making that even more burdensome and troubling is that many students owe more than they originally borrowed even after years of regular payments.

College in America is more expensive – and comes with less assurance of a job afterward – than it is in other countries and the student population much different than the traditional image might suggest. And while some might hold that low income students can work their way through school to help pay for it, doing so is difficult to near impossible for many.

I wonder how Mulvaney expects former students to pay off their loans when two-thirds have trouble finding work out of college, meaning they’re not earning anywhere near what’s needed to not only establish their own lives but also service the massive debt that’s been racked up. Whatever his thinking, it seems to be in line with fellow Trump administration official Betsy DeVos, who wants to make it harder for those who have been scammed by the for-profit college industry to have their debts erased.

When you put their goals and statements together, you begin to see a picture emerge: the idea seems to be that students who can’t afford to pay for college out of pocket should be discouraged, if not barred, from attending anything but a for-profit university. The loans they then take out should stick with them for as long as it takes to pay them off, with no possibility of forgiveness or relief. To do that, they should take whatever jobs are available, even if those jobs offer no chance of advancement, don’t come with benefits and are otherwise less than what was once hoped for.

Snapchat Opens Up User Content to Media Brands, But Something’s Missing

Repurposing user-generated content in marketing programs has been a practice since about a day after people started creating that content. Marketers quickly realized, all the way back in the early 2000s, that people reacted more positively to material generated by others like them than to staid, sometimes boring advertising material.

The latest iteration of this practice is Snapchat’s announcement it will allow media partners to aggregate and place ads against user posts on the platform, something meant to bolster the Discover section of the app and bring more eyeballs in.

What I find shockingly missing from the news is any sort of comment about how revenue might be shared with the people whose posts and other content are being aggregated.

In the old days there was an evolving conversation around the use of UGC. At first some reckless and carefree marketers would just grab a blog post or photo, yell “IT’S PUBLIC!!” at the top of their lungs and insert it or some version of it in their ad. The conventional wisdom, though, quickly changed and some standards were established, including that marketers at least had to contact the creator and get their permission. If the conversation included some sort of compensation, that was up to them.

An evolution occurred that turned this practice into what was first “blogger outreach,” where you contacted prominent bloggers in the hopes of getting review product into their hands, and then into what we now call “influencer marketing.”

Even in news media, which has tried various tactics to tap into the power of “citizen journalists,” the standard is at least that you secure authorization to use a photo or video in an official report. Unless there are details missing from the reporting on Snapchat’s new program, even that fundamental step is missing from this arrangement.

What the media partners Snapchat has lined up are getting is essentially free work. They get to use what others have produced at no expense of money or resources to themselves. To be blunt, they will profit in one way or another off unpaid labor. That kind of situation was quickly seen to be problematic in the early 2000s, and it’s certainly no better some 15 years later.

Snapchat and its media partners should make it clear that anyone whose content is used in an official Story will be paid something, even if it’s a small honorarium. Otherwise this looks like an exploitation of the public by companies hoping to leverage content produced by someone else because they haven’t figured out how to consistently create compelling material themselves.

Which Ideas and Actions are Worth Exploring?

Kara Swisher, the noted and widely respected tech journalist, has made it clear that yes, she would interview “wandering plague ship” Steve Bannon. Swisher’s op-ed comes on the heels of Bannon being uninvited from The New Yorker Festival after other attendees objected to his appearance, as well as a new documentary on Bannon that premiered at the Toronto Film Festival.

Her opinion, I think, is valid. We can’t counter opinions and beliefs if we don’t give people the chance to share them, no matter how noxious they appear. And if there’s anyone who could interview Bannon and not come off as morally neutral toward his positions and statements, it’s Swisher.

That being said, I don’t think there’s a lot of confusion or lack of clarity around where Bannon and his ilk stand on various issues. He’s spoken frequently and loudly in the way only a syphilis-ridden, anthropomorphic sewer grate can.

(Side note: For all the hubbub about Congress trying to get Mark Zuckerberg, Jack Dorsey and other tech leaders to answer questions about censorship and other topics, they should either A) just bring in Swisher to handle the questioning or B) save the taxpayers a ton of money and just find her previous interviews with these people. She’s the best.)

The question, though, remains: Who is deserving of time at someone else’s microphone? Allow me to share a few thoughts on the matter:

First, it’s alright to have standards, even if they’re subjective and personal, about who gets to speak. Someone may find it acceptable to give Bannon or any of the other sentient compost piles that frequent Breitbart comment sections a platform, hoping a public airing of their ideas will discredit them. For me, the line would be “have your ideas and beliefs ever caused someone to be convicted of a hate crime or a crime against humanity?” If the answer is yes, I don’t need to hear from you.

Second, it’s important for those standards to be consistent. For example, if it’s defensible to interview Bannon, whose ideas have arguably led to increases in discrimination, deportation and other harm, is it similarly defensible to interview Kevin Spacey or Louis CK, who committed actual physical assault? If not, why is the line being drawn between someone who instigates violent and abusive behavior and someone who commits it themselves? That distinction seems narrow to me.

Third, it’s permissible to be confrontational in those interviews. Don’t just give Bannon the stage and let him spew his vitriol. We’ve heard him on countless occasions and seen the results of his “burn it all down” beliefs in the policies enacted every day by the current administration. If he’s going to be given a venue, challenge him. Hold him accountable for the results of the policies he advocated for and for the situation of those he advocated against. Bring the receipts; don’t just let him sermonize.

Fourth, consider also booking representatives of the groups impacted. If you want to take the “well, it’s better to get those ideas out there so they can be countered” point of view, make it a point to give the same venue to those whose lives have been affected by those ideas. Let’s hear from the family of someone arrested and deported by ICE, or someone who’s lost a spouse or child to illness or injury because their health insurance was rescinded. In the last two years we’ve heard plenty from rural white people who may be slightly conflicted about supporting Trump but keep doing so, and little, at least in the mainstream press, from the urban immigrant who lives in fear of being assaulted on the way to work because of their skin color.

Yes, let’s make sure we understand the extent of the infestation of terrible, hurtful ideology. But let’s also remember that while sunlight is often held to be the best disinfectant, so too is closing up the windows, dropping a tent over the house and fumigating the hell out of it so no living thing survives. At this point the former doesn’t seem to be stopping or even impeding the progress of terrible ideas, so it may be time to try the latter.

Work, Dignity and Doing What You Can

Like the rest of you, I’ve been following the story of actor Geoffrey Owens over the last few weeks. After someone noticed him working at a Trader Joe’s – a job very different from his recurring role on “The Cosby Show” in the 80s and 90s – his became an interesting tale. What started out as “job shaming” has turned into not only new acting opportunities for Owens but also a lesson for everyone on the dignity of work, no matter what kind of work it is.

The story struck a nerve for me and I’ve struggled with what, if anything, I have to add to the conversation.

I’ve made vague references over the last two years to working a “part-time retail gig,” something I decided to do (with the encouragement and support of my wife), and written a bit about the challenges and rewards of doing so. But I’ve never *really* talked about what I’m doing and why I’m doing it, along with how it’s affected me.

Allow me to drop the facade and be clear.

Two years ago, in September of 2016, it had been three months since I was let go from Voce Communications. A full-time content marketing job was not forthcoming, though I was ending what wound up being a disastrous, soul-killing two-month contract position that began in August. As my job search continued to come up dry in October and into November, things were starting to get tight financially and I needed to do *something.* I was frustrated and bordering on depression.

The job search eventually expanded to include part-time retail work. In the end I got a response from just one company: Starbucks. I’ve worked there since November 2016.

In that time I’ve had to confront, both internally and externally, many of the issues that have come to the surface in the last couple of weeks. While I’ve addressed some of these topics in the past, allow me to be clear here on a few things. Specifically, I want to dispel some myths that frequently swirl around part-time work.

1: It’s Unskilled Work

I don’t imagine that the average Walmart or Target employee could immediately jump in and do the work of a CPA; that’s true. Some could, for various reasons, but probably not many.

That being said, I don’t imagine if you pulled the average CPA out of their office and asked them to make a venti iced caramel macchiato with soy milk, extra caramel drizzle and five pumps of sugar-free hazelnut syrup they’d get it right on the first try, either.

Here’s the truth: All work is skilled work. The differences are only in the details, in the kinds of skills being utilized. The people I work with are incredibly skilled at what they do, no less so than some of the other coworkers I’ve had in other jobs.

2: It’s For the Uneducated

This is malarkey. Again, the people working part-time retail or other jobs may have a *different* educational background than someone who works an office job, but that doesn’t make their education inferior or, worse, non-existent.

If you are disparaging the educational accomplishments of those who ring you up at Lowe’s or hand you your latte, you’re likely working hard to justify your own decisions, putting down those of others as part of that process.

Here’s the truth: We have a messed up system of correlating “work” and “pay/status” in this country. Someone sequestered in a cubicle for 40 hours a week is seen as worth more than someone who walks a store floor for similar – or greater – hours. That’s messed up.

Not only that, but if we are saying that some people are only fit for certain jobs because they lack the necessary education, then we need to confront the elements of the system that make the necessary education available to some but not others. How are we, as a society, restricting access to college – or even failing to provide a decent high school experience – to some people who will then be shamed because they can’t get a job that’s contingent on them having a college degree?

3: It’s For the Unmotivated

100% of my coworkers at Starbucks – every single one – have one or more of the following three things going on outside of the time they’re at the store:

  1. Another job, often in the service industry as well.
  2. School, usually college, though a few are still in high school.
  3. Kids they’re raising.

How’s that for motivation?

It’s not that someone working as a full-time professional doesn’t also have extra things along those same lines going on, it’s that it’s too often assumed the part-time worker is just working this job because they have to and would rather be home smoking weed and playing video games.

Dude, wouldn’t you?

Here’s the truth: It would be hard to find a group of people who have more motivation than your average retail employee. They need to hustle every damn minute of every damn day because (see #2 above) the pay stinks so they have to stitch a few things together to make ends meet, or because they’re working to get that education everyone values so highly. If the person behind the counter looks tired, it’s not because they’re lazy. It’s because they were up until 3AM studying and then came to work and after this shift they have to go tend bar at a place across town.

4: It’s Demeaning

Oh this is an interesting one, one that forms the core of the attitudes that initially swirled around Geoffrey Owens, or anyone who’s found to be working a job that seems “beneath” what we assume their status to be.

Many people seem to think those working part-time retail or service industry jobs feel bad about doing so, acutely aware that they’re working some sort of “less than” job.

Let’s be clear about this: Someone doesn’t feel demeaned until someone actively demeans them.

It’s not the work itself that creates a feeling of shame. “Hard work is its own reward,” the saying goes, and the people on the retail or restaurant floor are working plenty hard.

A feeling of shame or of doing demeaning work only comes when we’re treated less well than the counter we’ve placed your drink on. When we’re yelled at for not getting your drink right. When we’re shoved out of the way while emptying a garbage can by someone who can’t be bothered to say “excuse me” as they don’t look up from their phone. When someone can’t be bothered to make eye contact or offer a simple “Thank you” in response to our serving them.

If you stop demeaning people, they won’t feel demeaned. It’s that simple.

All of these points – and plenty of others – stem largely from a lack of empathy, something I myself have been guilty of on more than one occasion. The people who hold these attitudes have never worked a part-time job, or it’s been so long since they did that they’ve forgotten what it’s like to be on the other side of the register, or to be the one who’s sniped at because they don’t have an answer readily available.

That we’ve tied work so closely to identity in this country is problematic, because not only does it lead us to make decisions that sometimes aren’t in our own interest in order to keep a job but it leads us to project the attributes of a job we see as undesirable onto the person doing it. If someone is working what we see as a “menial” job, we assume that person to then be menial.

We couldn’t be more wrong.

As long as we continue to make access to healthcare, housing, good food, education and other basic necessities of life contingent on having not only *a* job but what’s considered to be the *right* job, we as a collective society are in no place to judge, much less shame, the choices someone makes regarding what kind of work they’re doing.

Working at Trader Joe’s? Cool, you are doing what you need to in order to support you and your family.

Working at Starbucks? Cool, you are doing what you need to in order to support you and your family.

Working at Wells Fargo? Cool, you are doing what you need to in order to support you and your family.

Working as a freelance graphic designer? Cool, you are doing what you need to in order to support you and your family.

Of course there’s the problem that not all jobs even meet the basic criteria outlined above. They may not pay enough to support yourself, much less anyone else. Part-time work often comes without medical and retirement benefits. So in those cases all that hard work doesn’t even have the same baseline personal benefit other jobs do. And there’s much less freedom involved: while the office worker may say “I have 30 minutes until my next call, I’m going to go get some coffee or take a walk,” the part-time worker can’t, as they’re required to stay on the floor their entire shift save for a couple of breaks they themselves have no control over.

All of this is why I continue to believe society as a whole would be better served if people were encouraged and allowed to take six months off from their full-time jobs every few years and go work a retail or service industry job. Get back behind the counter and see how you’re treated. Refresh your well of empathy so that when you drive through Starbucks you take a minute and engage the person at the window in a bit of conversation, or just smile and say “thanks, have a great day.”

That kind of behavior from customers means a lot because it happens so rarely. When it does, though, it shows that someone still sees us as human beings worthy of respect, not losers who have made so many poor decisions there are no other options available to us.

We all, to some extent, derive dignity from what we do, regardless of what that is. If you’re treating someone with anything less than the dignity they deserve, that’s on you, not them.

When You’re Out

Paul Simon is done writing music. That’s what the singer-songwriter, whose long career has produced some of the most essential pop music of the last 50+ years, has declared, saying he felt something just click off and signal that he was done.

That sentiment is similar to comments, reiterated just a few months ago, by Simon contemporary Billy Joel, who hasn’t released an album of new tunes since 1993. Joel said that after finishing the “River of Dreams” record he felt he was never going to get better as a songwriter, so he put down the pen.

Meanwhile, Paul McCartney is still going strong, seemingly still energized by the process of writing, recording and touring. And Prince reportedly was still putting ideas on paper or on tape right up to his untimely passing.

Writing, even the most technical and dry, requires some level of passion and commitment. You have to, in my experience, either be driven to get your ideas out into the world or use the act of regular writing to make yourself that much better than you were yesterday. Something has to be behind the impetus to express yourself. There has to be motivation.

So here’s what’s important to remember: It’s OK when that motivation runs dry. Admit when you’re out of words, when you’ve said everything you want to say or have proven whatever it is you set out to. That’s going to happen to some people and their decision and attitude should be respected.

Other people will continue to have that fire in them, that demon at their heels that keeps them going one more mile no matter how far they’ve already come. Good for them, they apparently feel restless if they stay in one place for too long.

It’s a reminder that creativity is not distributed evenly across all creators. Some people will churn out three novels a year; others struggle with one and then find it was everything they had in their heads. The creative impulse is given differently to everyone, in both depth and breadth.

All we can do is appreciate whatever it is that’s been put out into the world by others while, as creators, not judging ourselves too harshly if we find that we’ve hit the end of the road, that we’ve successfully outrun the demon, that we’ve drunk all the water allocated to us in our time.

Struggling With Journaling

One of the most common bits of advice given to writers of any kind is to keep a journal. Getting in this habit is supposed to help us keep a routine of writing regardless of what else is happening, hone our observational skills and act as a way to let thoughts flow and so on. It’s a good idea and one that likely works for many people.

I’ve never been able to do it, at least not in the way that most of those dispensing the advice seem to mean.

My past is filled with wonderful looking notebooks – and even Google Doc files or Evernote documents – that were created with the specific intent of being a journal. I set out saying “I’m going to take 10 minutes each day and just make this happen. I’m going to write about X and I’m going to capture my thoughts and do it.” Three entries later they’ve been abandoned like a burning Crown Victoria on the side of a rural highway.

The problem for me, it seems, is that this structure just isn’t how my brain works. If I start a “journal” it will fall by the wayside. It just will.

As I read the latest piece encouraging, in this case, freelance writers to get into the daily journaling habit, I realized there was a dogmatic streak to the thinking behind the advice being offered. If I simply pushed that to the side, then I was absolutely doing what they and so many others had suggested.

It’s called “blogging.”

Let me state first that I’ve never much liked the term “blogging.” It was first used to draw a clear line between what people were doing with nascent self-publishing platforms in the early 2000s and the journalistic writing being done by professionals who had made it through the gatekeepers of the media industry. So it was useful at first, but it soon became a dismissive, slightly derogatory term those professionals used to cast the format in a light that let them not take it seriously. “Blog writing” has always seemed better to me since it more accurately described what was being done and where it was happening.

That being said, blog writing *is* and was initially meant to be a form of public journaling. Before it became a business model and first- and second-wave blog writers/editors launched media empires, blog writing was simply sharing your thoughts and opinions out loud. Sometimes those were professionally-oriented, sometimes they were more personal.

Think about the “mommy blogger” category as an example. At first those women started blogs to chronicle their experiences and frustrations with parenting and find comfort and support from others in the same boat. They were journals from women who wanted to share what they were going through with the world, either just to vent and release the words out into the world or to provide help for those seeking it.

Only later did it – and other categories – turn into a haven of “influencer marketing” filled with product recommendations resulting from agency/company outreach and paid campaigns.

Blog writing for many (including myself) is still a form of journaling, though. I may occasionally weigh in on something more news-oriented or share an industry-specific opinion or story, but even that is keeping a journal. Writing, for me, is therapy, just the kind of thing that journaling is always held up to be.

If you’re a writer who sees that same kind of advice frequently offered by those seeking to be helpful, don’t feel bad if you’re not keeping the perfect Moleskine journal each day. Look at what you *are* doing and see if there’s something that fits the intent of the advice even if the format or style is slightly different. Odds are good there’s something that fits the bill.

Sometimes The Best Word Isn’t Best

I walked outside last night and knew immediately rain was imminent.

It was that smell in the air. You know the one. The smell of dirt and decaying grass and dampness and wood.

As an adult, it’s the smell that tells you whether the dinner you’re grilling will be able to finish. As a kid, it’s the smell that tells you your time outside is about done.

When someone says “Smells like rain is coming,” you know exactly what that means. That smell immediately enters your nose and you remember some time in the recent or distant past that’s tied to it.

Writers are often told to find the best, most elegant word to convey a sense, emotion or visual.

The word for what’s described above is “petrichor,” a term coined in 1964 by two scientists who combined two Greek words, one for “stone” and one for the blood of the gods.

It’s a good word that will certainly earn you the respect of those who understand it and appreciate its proper usage.

But it’s not the best word. It does nothing to stir the imagination. It does nothing to create a connection with the reader or audience. There’s no heart or soul in it.

That’s not uncommon with scientific words and terms. They are meant to be descriptive, not poetic.

Don’t worry about finding the best word. Instead, find the best way to light a fire. Establish the setting and paint a picture, don’t just describe.