Have you ever wondered who's behind #NewRules art? It’s not some swank Madison Avenue agency or digital art students from the best design schools in the nation. It’s a collaboration between two humans (Marcia and me) and DALL-E, an artificial intelligence program developed by OpenAI that generates images from textual descriptions.
The creation of #NewRules’ art has been a long and iterative process, requiring our team to draw on years of experience and education while learning a whole new language, better known as prompts. By doing this effectively, #NewRules has created results that not only look great but also account for bias, something that is built into the underlying language models, is highly problematic, and needs to be unwound.
In the article “What AI can and can’t do (yet) for your business,” authors Michael Chui, James Manyika, and Mehdi Miremadi of McKinsey explained that these biases in AI “have a tendency to stay embedded because recognizing them, and taking steps to address them, requires a deep mastery of data-science techniques, as well as a more meta-understanding of existing social forces, including data collection.” To illustrate the bias noted by Chui et al., I gave DALL-E a simple prompt for this article: “give me an image of a pilot and a flight attendant.” The results below speak for themselves.
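For readers who want to run the same kind of test, here is a minimal sketch of how a prompt like that can be sent to DALL-E, assuming the OpenAI Python SDK and the dall-e-3 model; the model name and settings are assumptions for illustration, not the exact setup behind #NewRules art.

```python
# A minimal sketch, assuming the OpenAI Python SDK (openai>=1.0) and an
# OPENAI_API_KEY set in the environment. Model and settings are illustrative.
from openai import OpenAI

client = OpenAI()

response = client.images.generate(
    model="dall-e-3",
    prompt="Give me an image of a pilot and a flight attendant",
    size="1024x1024",
    n=1,
)

# The returned URL points at whatever the model defaulted to, which is
# exactly where unexamined bias tends to show up.
print(response.data[0].url)
```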


One of the biggest challenges I’ve encountered when working with DALL-E is that it tends to default to white, male, and young unless otherwise directed. Oddly enough, DALL-E also has this persnickety habit of creating Barbie- or Ken-like physiques. Recently, for an article on midlife women in the workplace, I asked DALL-E to create an original image of a diverse group of businesswomen, including those with average body types. DALL-E’s result was an overly buxom and hypersexualized group of women. The image, which can be found below, hit the proverbial cutting room floor and became the inspiration for this article.
Everybody’s using it
Despite AI’s apparent pitfalls, a majority of U.S. employees are using generative AI tools for work-related tasks, according to a recent survey conducted by The Conference Board: more than half engage with these technologies at least occasionally, and nearly one in ten do so daily.
Approximately three-quarters of the organizations surveyed in the same study had yet to establish and communicate a clear policy on the use of AI within their operations. That could be a problem, especially if their workers lack the fundamental skills needed to interact with AI.
The folks who do have these skills are known as ‘prompt masters,’ and their skills are largely honed over time. They have a deep understanding of history, literature, writing, philosophy, sociology, psychology, and the arts. They also have an awareness of the zeitgeist, including current social and political trends, along with unwavering curiosity and creativity.
Prompt masters possess a deep understanding of what AI can and can't do. They act as translators between humanity and the complex world of AI. They know that crafting prompts isn't just about hoping for the best. It requires a careful blend of clarity and imagination.
The prompt masters realize that it is their responsibility to ensure proper racial, gender, and age representation in their initial prompts and to correct faulty outputs when they arise.
Working with AI
I use generative AI on a daily basis, and I’ve picked up a few tricks along the way to get the most out of it.
First, you've got to know what you're after. You need to tell the AI exactly what you want in detail so it knows what to create. Think about not just the format but also the message you're aiming to convey — whether in text or image — and address any and all bias upfront.
Here's a strategy I use: ask open-ended questions. This makes the AI put on its thinking cap, resulting in more comprehensive and creative outputs than basic instructions alone. And since you already have a rough idea of what you want, it's easier to spot if your prompt needs a bit of tweaking.
And remember, it's all about trial and error. The AI's output hinges on your instructions, so the more specific you are, the better the outcome. If the first try doesn't hit the mark, figure out what's missing or off and adjust your prompt accordingly. Keep iterating.
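To make that loop concrete, here is a sketch of what iterating on an image prompt can look like in code, again assuming the OpenAI Python SDK; the prompts themselves are illustrative, with each pass adding the specificity (including representation) that the previous one lacked.

```python
# A sketch of the trial-and-error loop, assuming the OpenAI Python SDK and a
# DALL-E model. The prompts are illustrative, not taken from a real project.
from openai import OpenAI

client = OpenAI()

attempts = [
    # Pass 1: the vague version, likely to fall back on the model's defaults.
    "A group of businesswomen in a modern office",
    # Pass 2: add the format and the message you are actually aiming for.
    "A photorealistic image of five businesswomen collaborating around a "
    "whiteboard in a modern office, candid and professional in tone",
    # Pass 3: spell out the representation the defaults tend to miss.
    "A photorealistic image of five businesswomen of different ages, "
    "ethnicities, and average body types collaborating around a whiteboard "
    "in a modern office, dressed in ordinary business attire",
]

for i, prompt in enumerate(attempts, start=1):
    result = client.images.generate(model="dall-e-3", prompt=prompt, n=1)
    print(f"Attempt {i}: {result.data[0].url}")
    # In practice, review each image here and decide what the next prompt
    # needs to add or remove before iterating again.
```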
Seven steps to get it right
There are no iron-clad rules to get the best results from generative AI. However, there are key points that users should consider throughout the process. Here are my seven:
Clear Vision: Understand exactly what you want from AI, whether it's creating engaging content, solving complex problems, or generating innovative ideas and images. You should also know AI's capabilities and limitations and act accordingly.
Effective Communication: You need to express your needs clearly, concisely, and in a language the AI understands. Your prompts should be well-thought-out to elicit the desired response from the AI.
Creative Problem-Solving: AI can often provide unexpected results, and you must be ready to think on your feet, mix and match ideas, and experiment with different prompts and inputs to create the perfect output.
Patience and Perseverance: Not every attempt will yield success. Be prepared to try and retry, refining your approach each time. Understand that perfection often requires patience and multiple iterations to achieve the desired outcome.
Analytical Mindset: Scrutinize the AI's responses, identify patterns, and understand why certain inputs work better than others. This analytical approach helps in fine-tuning your strategy for more effective interactions.
Adaptability and Continuous Learning: The world of AI is constantly changing, with new advancements and possibilities emerging regularly. Staying informed and adaptable, willing to learn and embrace new methods, is crucial.
Ethical Awareness and Responsibility: Understanding the ethical implications of using AI is paramount, not just for your organization but also for society. Addressing bias is essential, and it can be built into your workflow as a routine step, as the sketch after this list shows.
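As one illustration of that last point, here is a small, entirely hypothetical helper (not part of any library or of the #NewRules workflow) that nudges a prompt writer to spell out representation before an image prompt is sent:

```python
# A hypothetical pre-flight check for image prompts. The cue lists are
# illustrative and deliberately incomplete; the point is the habit, not
# the specific words.
REPRESENTATION_CUES = {
    "age": ["young", "middle-aged", "older", "midlife"],
    "gender": ["woman", "women", "man", "men", "nonbinary"],
    "ethnicity": ["black", "white", "asian", "latina", "latino", "indigenous"],
    "body type": ["average body", "plus-size", "athletic", "varied body types"],
}

def audit_prompt(prompt: str) -> list[str]:
    """Return the representation dimensions the prompt never mentions."""
    lowered = prompt.lower()
    return [
        dimension
        for dimension, cues in REPRESENTATION_CUES.items()
        if not any(cue in lowered for cue in cues)
    ]

prompt = "A group of businesswomen collaborating around a conference table"
for gap in audit_prompt(prompt):
    print(f"Consider specifying {gap} before sending this prompt.")
```

A check like this will not catch everything, but it turns addressing bias upfront from a good intention into a step you actually take every time.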
In the end, good old-fashioned teamwork is often key to getting AI right, especially for complex projects. Bringing together different skills and perspectives will always improve the outputs, just as it does for organizations successfully leveraging diversity and inclusion efforts.
Keep prompting.