May 24, 2023 | Published by Raffy Montehermoso

Why Midjourney is truly ‘mid’-journey and not the final destination

What happens when our graphic designers use the Midjourney AI tool to generate graphics? Is the resulting image fit for a logo or illustration? Find out more.

Have you ever been frustrated because you can imagine the artwork you want but can't seem to get a designer to put it on paper? Now imagine having that artwork created for you simply by typing out your thoughts.

Midjourney is an AI image-generation tool that does just that. For all of its hype and publicity, I have to admit that the level of creativity it gets from a mere 10-20 word input is mind-boggling. An idea that might take hours to explain to a designer can now be typed into an AI tool, and BOOM - 10 seconds later, you've got your design. But is it just a gimmick, or can the outputs be put to real use?

We put this to the test, taking some real client design briefs and creating two designs for each: one without any AI tools, and one using Midjourney. The results are fascinating!

 

Testing Midjourney Against Human-Made Designs

Case 1: Football Academy Logo

Client brief: Create a logo for a football academy that trains linemen to be the best linemen. Reflect this vision by referring to a Kenyon Green photo and combining it with several other elements. Here are some images provided by the client for inspiration:

Using Midjourney to Create the Logo

Based on the client’s brief, we gave Midjourney Kenyon Green’s photo and combined it with the prompt "flat logo design, blue and black, trenches background, blue glowing eyes, football".
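For anyone who hasn't used the tool: Midjourney runs inside Discord, so a prompt like this is entered through the /imagine command, with the reference image's URL pasted in front of the text. In our case the input looked roughly like this (the link below is just a placeholder for the client's Kenyon Green photo):

/imagine prompt: <link-to-Kenyon-Green-photo> flat logo design, blue and black, trenches background, blue glowing eyes, football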

Here’s the result:

Midjourney came up with several images showing the Kenyon Green figure in different poses. One thing we observed is that how specific the prompt is makes a big difference to what gets generated: when we dropped "flat logo design" from the prompt, the result was a hyper-realistic image instead.

To improve on the output, we passed the prompt "flat illustration, blue and black, Kenyon Green, trenches background, logo badge style, simple, flat, vector", again including Kenyon Green's photo as a reference.

We got closer to a logo-style design, but the images are still incoherent, and the elements look as though they were pasted in haphazardly. None of the Midjourney output would pass as a final logo, even with minor edits; several elements would need heavy reworking before they could be used.

However, they did serve as inspiration for the logo I then built in Adobe Illustrator.

How Delesign’s Designers Created the Logo

The Kenyon Green photo was used as a reference for how the lineman's body position would appear in the logo. The real challenge was the hand placement from that photo. It was easier for me to place the hands correctly in Illustrator, something Midjourney struggled to do.

I was also able to apply the client's branding easily, which was a limitation in Midjourney.

Midjourney-Assisted Design vs Human Design: Let’s Compare the Results!

In our test for the football academy logo, the Midjourney AI tool gave us a quick visual sense of the colors and the arrangement of elements. It works well when only a few elements need to appear in the logo; as more elements were added, the illustrations Midjourney created looked very cluttered.

Using the exact graphics generated by the AI platform as a base image would take a designer more hours to modify than creating the logo from scratch. Starting from scratch, it's faster to control the positioning and color, and to add any element to the logo.

The Midjourney-assisted images came out rather disjointed, with a lot of noisy elements added to the design. However, some of them did serve as design inspiration for me: I got a good idea of how certain elements could be arranged in the logo, and the tool generated different styles and possibilities very quickly, which made for a good starting point. I could then use Illustrator to arrange every detail into a balanced, cohesive logo illustration that the client could use.

 

Case 2: Octopus Social Icons Illustration

Here is another client brief we put to the test:

Client brief: Come up with an illustration resembling the brand character's octopus graphics. It must have tentacles with social media icons at the ends.

Using Midjourney to Create the Illustration

We used a long prompt on the first try: "Vector type illustration, flat, teal cute happy octopus wearing circular eyeglasses, each tentacle holding the icon of Twitter, Instagram, LinkedIn, Pinterest, Facebook, and Google".

Here’s what we got:

The result fell short of the client's brief: no social icons were added to the tentacles, and the tentacles themselves were too short. Still, it made a good reference for what the mascot could look like.

We then tried a new prompt, this time adding the client's octopus peg (their reference image) to seed Midjourney with some inspiration. Here’s the prompt: "flat teal octopus illustration, smiling, wearing glasses, vector, simple". With this visual reference added, we now had visibly longer tentacles.

These still fell short in terms of having the social media icons at the ends of the tentacles. However, Midjourney did create some good illustrations to use as a starting point.

How Delesign’s Designers Created the Illustration

Without using the AI tool, I was able to produce a very similar output. The difference, however, was that the social icons were properly placed at the ends of the tentacles.

The circular eyeglasses were spot on. Even the "mood" of the octopus mascot came through properly, complete with a visible smile and blushing cheeks to boot.

As for the tentacles with social media icons? All were placed uniformly. They were the actual social media icons, and they were even edited to be visually coherent with one another.

Midjourney-Assisted Design vs Human Design: Let’s Compare the Results!

In this case study, the task was simpler and similar images can be found all over the internet, so Midjourney produced a much more accurate output than in the logo scenario. However, the social icons on the tentacles weren't what we hoped for from the AI-generated graphics. This is where the tool fell short: small elements were added near the tentacles, and it didn't help that none of them were actual social media icons.

The octopus illustration produced by our designer went through several iterations, too. The difference, however, was that the client's precise instructions were followed (like the brand character's eyeglasses and the social icons at the ends of the tentacles).

The image generated via Midjourney almost had it perfect - if it weren't for the specific instruction to place social media icons at the ends of the tentacles. Referring to the Midjourney image makes it easy for a designer to recreate the same graphics and build from there.

Designers can easily modify and tweak the AI-generated images, especially for a simple illustration like the octopus with social icons we made - maybe a few adjustments to the elements and some color matching. But trusting Midjourney to produce the exact graphics required? Not so fast.

 

What Do We Think About Midjourney?

In the process of using Midjourney, we discovered that the tool is good at conveying a client's ideas or pegs (reference concepts). It is most useful at the beginning of the design process, speeding up the concept phase, where it matters that both the designer and the client have a visual representation of their ideas.

Midjourney AI can help bridge this gap. A client could understand a designer's vision better if it's represented with an image they can reference, which a designer can produce faster using the tool.

As for the client, they may be at a loss for words to describe the exact image they want. Designers can simply feed prompts into Midjourney based on the client's rough idea.

When it comes to the exact elements, colors, composition, and other details of the final output, however, the tool can't be trusted on its own. Conveying concepts is just one of the working scenarios we're seeing for Midjourney; relying on it entirely to produce usable graphics isn't possible at this point.

What would be awesome is if Midjourney could render its output as layered files that could then be edited easily in the likes of Adobe Illustrator. Currently, as a designer, you are limited in what you can do with a Midjourney-generated output: you either have to recreate the entire thing or edit on top of a flattened image, which limits your flexibility.

Let's see how Midjourney evolves to assist the art and design creation process. We're all eyes on the journey.