Nightingale: The Journal of the Data Visualization Society
https://nightingaledvs.com/
Tue, 10 Mar 2026 14:17:31 +0000

Trends, Aesthetics, and Individuality: How the Internet Irrevocably Changed Fashion
https://nightingaledvs.com/trends-aesthetics-and-individuality/
Tue, 10 Mar 2026 14:17:29 +0000

The post Trends, Aesthetics, and Individuality: How the Internet Irrevocably Changed Fashion appeared first on Nightingale.

Close your eyes, and picture an outfit from the 1980s. Now, the 1990s. The 2000s. Chances are you thought of perms and shoulder pads first, then grungy flannels and preppy streetwear, before finally thinking of low-rise jeans and velour tracksuits. 

But if I were to ask you to picture something from the 2010s, that answer might range anywhere from colored leggings to checkered Vans. That range gets even wider when we look at the 2020s so far.

We used to have a very clear idea of which styles belonged to which decade, but that distinction has grown increasingly muddy over the last fifteen to twenty years. We’ve lost the pattern of one or two iconic styles defining each decade, and that loss shapes which decades we favor. The 80s, 90s, and 2000s—decades with only a handful of predominant styles—rank highest when respondents are asked for their favorite fashion decade.

On average, 9.5% of respondents favored the 80s, 11.25% favored the 90s, and 8.25% favored the 2000s. 

Even when we abandon the idea of “favorites,” those decades still rank highest when respondents are asked how fashionable they find each decade. Repeatedly, the 2010s and 2020s rank lowest on average when it comes to being fashionable decades with a defined sense of style.

Fashion trends are increasingly speed-running their usual five stages: introduction, rise, peak, decline, and obsolescence. Instead of the usual fifteen to twenty year cycle, we’re now seeing trends rise and fall within a matter of months. What happened?

The Internet.

Internet usage has increased across all generations over the last 25 years, and with it, our access to fashion inspiration outside of current pop culture. Instead of fashion trends being born on a runway and trickling down through magazines, movies, and music videos to the general public, modern teenagers and young adults are finding their new favorite styles on their For You and Explore pages, with 42% of Gen Z listing social media as their main source of fashion inspiration.

While expressions of individuality and personality have always been a priority when it comes to fashion, younger generations now feel that burden more acutely due to their exposure to the world online. Pre-internet, you knew the people in your town, and you knew the familiar movie and music stars. It was normal for everyone to take fashion inspiration from the screen, like when Dirty Dancing had everyone in leotards, or when Top Gun boosted aviator jacket sales. In the digital era, you have access to the whole world. 

That’s not an exaggeration, either. Out of the 8 billion people on Earth, more than 5.17 billion use social media and spend an average of over 2 hours scrolling every day. Instagram and TikTok are the top platforms for young adults, with 89% of Gen Z users on Instagram and 82% on TikTok. 

Breaking those audiences down makes it even more jarring to realize how many people we’re seeing on our screens now. Instagram alone has 3 billion monthly active users, and nearly a third of them are 18-24 year olds. TikTok is no different, with a majority of its 1.9 billion monthly active users being Gen Z. Additionally, out of Pinterest’s 553 million monthly active users, 42% of them are Gen Z, often searching specifically for style inspiration.

With these sorts of numbers, it’s not outrageous to assume that a young adult in 2026 will see thousands of strangers online every day. More often than not, they’ll see these strangers jumping onto the same trends they are, but when the whole world is following the same trends, how is anyone meant to feel like an individual? How are the 71% of Gen Z’ers that prioritize personality in their style meant to feel like they’re unique?

It seems their answer is a wider range of hyper-specific aesthetic niches. Now, if you aren’t chronically online, you might think the word aesthetic is an adjective describing something “concerned with beauty or the appreciation of beauty,” or maybe a noun for “a set of principles underlying and guiding the work of a particular artist or artistic movement.” In the modern online fashion world, it means something a bit more distinct.

A clothing aesthetic in 2026 can be defined as, “your personal style or the overall vibe your outfits create. It’s the visual theme that ties your wardrobe together, from colors and patterns to the types of pieces you wear,” according to Copenhagen Fashion Summit. Included with the site’s definition are no less than 42 different aesthetics, such as Soft Girl, Clean Girl, Streetwear, Fairycore, Cottagecore, Witchcore, both Light and Dark Academia, and many others. 

Fairycore. (Source: Cris Ramos)
Dark academia. (Source: Murat Esibatir)
Cottagecore. (Source: Eugenia Sol)

Cottagecore might be one of the most popular, emerging back in 2019, and is essentially a romanticization of rural life. Cottagecore styles include warm and earthy colors, flowy dresses, puffed sleeves, and cardigans, while activities include gardening, crocheting, and baking bread. Overall, it’s a cozy, peaceful aesthetic that prioritizes comfort. While the general trend might’ve died a few years ago, Cottagecore has quietly lived on past its hype like many of these aesthetics tend to do.

Since Cottagecore’s heyday, aesthetics have gotten even more specific. Depop, a popular clothing resale platform, posted their 2024 Trend Report, and the following ‘core’ styles had some of the highest search volume increases: “Contemporary classics,” “Minimalist renaissance,” “Retro sportswear,” and “Indie vanguard.”

Contemporary classics is defined as an “updated take on ‘old money’” in the report, reviving preppy styles by blending Ivy League style with countryside vibes. Brands like J.Crew and Ralph Lauren are named as the leaders here, with Depop saying the aesthetic “reflects a yearning for stability and reliability.” 

The Minimalist renaissance is a return to “understated elegance,” according to Depop, and is focused on clean lines, neutral colors, and classics like cashmere and tailored coats. This aesthetic has a specific focus on craftsmanship and dedication to timeless taste.

Retro sportswear follows the more traditional trend pattern of recycling from decades prior, and pulls from 80s windbreakers and 90s athletic styles, combining them with modern flair for nostalgic yet practical outfits. This specific style’s increase could be attributed to the rise in popularity of casual sports like pickleball in recent years. 

And finally, Indie vanguard is described as “bold reimagining of 2010s indie sleaze and hipster culture,” combining grunge and punk styles with the early 2000s. Think band tees paired with knee-high boots and boas. Even better, think Charli XCX’s style during her “Brat” era from the summer of 2024.

Now, is the rise of aesthetics a bad thing? In a general sense, I don’t think so, but there is an important caveat. It’s completely fine that younger generations have traded an agreed-upon “uniform” of sorts for specific, sometimes eccentric wardrobes. What we consider to be “normal” changes constantly, and what was normal for trends thirty years ago just isn’t normal anymore.

However, with trends moving as fast as they do, there are significant production concerns, especially the effects on the environment. Fast fashion—the manufacturing process concerned with mass-producing clothing to keep speed with trends—eats through fossil fuels with its use of polyester and contributes up to 10% of annual global carbon emissions, only for the clothes to end up in landfills at best, and our oceans at worst.

This issue, though, might be reaching its turning point. Younger shoppers are beginning to prioritize sustainable clothing practices, and the secondhand clothing market value is going up. Even when we look at Shein, one of the most notorious fast fashion brands, its downloads were cut nearly in half between 2024 and 2025. This in no way diminishes the threat and consequences of trendy and unsustainable clothing, but it might be the beginning of the way out. 

Trends have always been part of the fashion world, but once we got the internet, they became something entirely new. When nearly everyone on Earth is able to search for fashion inspiration online, you trade a handful of decade-defining styles for a thousand niche aesthetics that live on beyond their trend cycles. The earth might not have ended with Y2K, but a new world of fashion and individuality was certainly born.

The Collaborative Blueprint: The Open Visualization Academy as a Community of Learning and Friendship
https://nightingaledvs.com/the-collaborative-blueprint/
Tue, 03 Mar 2026 18:04:41 +0000

The post The Collaborative Blueprint: The Open Visualization Academy as a Community of Learning and Friendship appeared first on Nightingale.

Let’s get the basics out of the way quickly: if you’re a regular reader of Nightingale, you’ve likely heard about the recent launch of my Open Visualization Academy (OVA). And you’re probably familiar with its goal of becoming the free and open library of educational materials on information design and visualization.

(All OVA courses are free and published under a Creative Commons Attribution-NonCommercial license, so help us spread the word among friends and colleagues!)

In this article I’d like to go beyond these basics, taking you behind the scenes and explaining my motivations for creating this project.

The idea for the OVA was planted in late 2012. That October, in collaboration with the Knight Center at the University of Texas, I launched the first journalism Massive Open Online Course (MOOC) in the world. It was titled “Introduction to Infographics and Data Visualization.”

It was a low-budget — better said, no-budget — experiment: I recorded all the videos at home, and there was barely any editing or planning. And yet so many people signed up (2,000 in just a few days) that we had to close registration and open a second edition right away, capped at 5,200 students. After that, I kept offering MOOCs until the COVID pandemic hit.

Tens of thousands of people from more than a hundred countries have participated in at least one of these massive free courses. To this day, wherever I go — to attend a conference, to give a talk or workshop — I’m approached by at least one person who started their career in visualization, information design, or other data-related fields thanks to these MOOCs.

As someone who considers himself first and foremost an educator and popularizer, I can tell you that there are few things in life that feel better than that. I won’t name them here; let your imagination fly.

The idea for the OVA reached maturity in 2023, when I was giving the last touches to The Art of Insight, a book of interviews with designers I admire. While talking to them, I kept asking myself: I’m learning so much in this conversation — wouldn’t it be wonderful to have all these people, and many more, design courses about the topics we’re discussing, and publish them for free on some kind of open, collaborative platform?

Therefore, the seed for the OVA was my old MOOCs, and it germinated thanks to my latest book. Also to the fact that I could “water” the seed: as a Knight Chair, I have a personal annual budget I can use to fund initiatives I believe will be societally beneficial. The OVA will be my main project in years to come, along with a fifth book.

If you watch any of the courses already available in the OVA, you’ll probably notice that they look pretty different from what you can find in, say, Coursera or edX.

OVA courses are (deliberately) scrappier, chattier, and sometimes even contain a bit of rambling. All that is by design. Whenever I talk to a new instructor I give them the following recommendations:

Keep it professional, but not too professional. I don’t want a sanitized, perfectly polished production. I want a certain and limited amount of imperfection — the kind that signals a real human being on the other side of the screen. In the very first video of my own OVA course I say that I profoundly dislike strictly scripted, TED-like canned presentations. I find them soporific.

Keep it rigorous, but also personal. I ask every collaborator not to be objective. I don’t want view-from-nowhere courses. I’m not interested in, say, a generic ‘Accessibility in Data Visualization’ class; what I do want is Frank Elavsky’s personal take on ‘Accessibility in Visualization’ (yes, that’s already available in the OVA). I want our courses to be accurate and rigorous, but also to reflect the convictions, personalities, and quirks of their creators.

There’s already plenty of content out there that looks and sounds like it was extruded by probabilistic automatons such as ChatGPT or Claude. I want the very opposite of that.

Make students aware that they are about to become part of an expanding community of friends, which they ought to nurture. At the beginning of my OVA course I explain that I want viewers to feel like they’re sitting with me in my favorite corner at home, surrounded by my tabletop games and my books, sharing a few hours of learning and joy.

I’d like OVA viewers to feel like I felt at the beginning of my career, when my mentors at the newspaper that hired me as an intern, La Voz de Galicia, taught me our craft while letting me watch over their shoulders. Theirs was a rare form of high kindness.

The OVA is my way of honoring those early mentors, and many more that I had the luck to cross paths with throughout the past three decades. As I’ve said in recent talks, if you know that you’ve benefitted from other people’s generosity (and who hasn’t?), eventually you must strive to emulate that behavior.

Convey an ethos. In a recent talk at MIT I explained that I’ve come to believe that what I teach — both at the University of Miami and elsewhere — isn’t just a series of principles or techniques, but a way of being and acting in the world.

That’s how I’ve always understood journalism and visualization design: yes, they are professions, yes, they are knowledge domains with their own methods, heuristics, conventions, inherited practices and so on and so forth. However, to me, they are more than that: journalism and design are ways of looking at reality, while we navigate it together. They are also peculiar ways of being a human being.

I hope the OVA will carry that spirit — not just to teach people how to design data graphics, but to invite them into a particular way of seeing.

What’s next? We’re planning to release roughly one new course per month; we have nearly a dozen in the works, covering a large variety of topics.

And yet I’m convinced that we’re barely scratching the surface; if you think that you have a brilliant idea for a course, let us know. We’ll need just a title, a description (not more than 2 paragraphs!), a table of contents, and a couple of sample videos to see how well you present to a virtual audience.

If we like your idea, I’ll pay you to bring it to life, and will welcome you to the growing OVA community.

I assure you that it’s a great place to be.

Categories: Community

Teo Popescu Named New Managing Editor of Nightingale
https://nightingaledvs.com/teo-popescu-named-new-managing-editor-of-nightingale/
Mon, 02 Mar 2026 14:00:00 +0000

The post Teo Popescu Named New Managing Editor of Nightingale appeared first on Nightingale.

Beginning March 1, 2026, Teo Popescu steps into the role of Managing Editor for Nightingale. Succeeding Will Careri, Teo is honored to take up the mantle left behind and ring in a new era of the journal of the Data Visualization Society.

Will Careri served Nightingale as Managing Editor from 2024 to 2026, after holding roles as a contributing writer and member of the core editorial team beginning in 2022. As a writer, he published two of the most-read articles on Nightingale in “Designing for Neurodivergent Audiences” and “The Visual Evolution of the Tommy Westphall Universe.” As Managing Editor, he worked with nearly 150 contributors to produce more than 200 articles. Additionally, throughout his time with the publication, he assisted in the development of three print issues of Nightingale—”Guidelines,” “Emotion,” and “Nature.”

Will Careri showcasing Issue 5 of Nightingale.

While Will is stepping down from his day-to-day role as Managing Editor, he looks forward to continuing as an active editor and contributor in new and similar capacities.

Teo Popescu is no stranger to Nightingale, having served as the Content Editor from 2024 to 2025, primarily responsible for bringing the print edition to life, particularly “Issue 5: Nature.”

Outside of her work with Nightingale, she is the design, graphics and data editor at KUOW Public Radio, Seattle’s local NPR station. Her work includes running KUOW’s Trump legal tracker for the first three months of the administration, visualizing ICE arrest data in Washington state, and contributing to a joint story with ProPublica on Seattle’s potential misuse of shelter funds. Additionally, she teaches multimedia and graphics journalism at UC Berkeley and the University of Washington.

While stepping into the Managing Editor role marks her return to Nightingale, she has spent the past year focusing on launching her new data journalism podcast, “Control F,” where she and her co-host dig deep on a topic and search through research, algorithms, and assumptions to bring listeners insights on how stuff works using data and visualizations.

I’m excited to come back to the Nightingale community. It’s an honor to come back in this way—to take up the mantle Will leaves behind. I’m excited to get to know everyone better and continue to foster a sense of community. I hope that with every article we publish, the love for our craft is evident on the page. It’s what keeps me coming back, and I hope it helps sustain you in these times, too. 

Teo Popescu

The Back of the Painting: On Structure, Integrity, and Data Visualisation
https://nightingaledvs.com/the-back-of-the-painting/
Tue, 17 Feb 2026 16:41:46 +0000

The post The Back of the Painting: On Structure, Integrity, and Data Visualisation appeared first on Nightingale.

In the early 1420s, Fra Angelico, a Dominican friar and painter, completed his first large-scale work for the newly built monastery at San Domenico. The San Domenico Altarpiece is one of the Early Renaissance’s defining works and adorns the high altar where the friars once sang their hymns during the Divine Office. Last year, the altarpiece was removed for restoration and featured in a major exhibition across Florence. On the front, the polyptych depicts four haloed saints in a single unified space, each attentive to the Virgin and Child. The Virgin and Child are themselves surrounded by angels with vibrant multi-coloured wings, their feathers shifting through a prismatic palette that is particularly iconic of Fra Angelico’s work.

San Domenico Altarpiece by Fra Angelico. (Source: Web Gallery of Art)

To look at the back of the high altarpiece, however, is to see an intricate collage of wood from various centuries. It serves as a physical record of how the work has been altered as tastes have changed over time. In the seventeenth century, carpenters recut the original panels and added new wood to force the piece into a rectangle. Beechwood inserts, shaped like butterflies, and crossbeams cover the surface, running against the natural grain. Poplar meets beechwood, intersecting in different directions, each species moving discordantly with humidity and the passage of years.

Roberto Buda, a conservator who specialises in wooden panel paintings, spent close to nine months stabilising the altarpiece’s structure. Working with his team, he removed the existing crossbeams and butterfly-shaped inserts, replacing them with carefully matched old poplar wood infills aligned parallel to the wood’s grain. A new frame was added with conical springs that allow the wood to move naturally. “It’s a house,” Buda told the Financial Times during the restoration. “If you don’t have a good foundation, it doesn’t hold up. The painting will never look good if the support is not right.” 

Months later, as I sat at my laptop placing an axis in the centre of the page, I thought again about this quote.

2025 marked a deliberate transition in my career. During my PhD in experimental neuroscience, I learned to do many things at once. I built hardware and software. I designed experiments. I ran those experiments, analysed the data, visualised the results, wrote papers, and taught students. Academia rewards this kind of breadth, and a range of technical skills accumulates quickly. Yet I found myself most engaged at the very end of the workflow, sitting with a dataset that had not yet been interpreted. I wanted to slow down and look for the narrative in the data, to focus not only on results but on how those results are communicated—clearly, honestly, beautifully.

In academic research, figures are often produced in haste, appended at the end of the pipeline. There is a script, a deadline, a familiar plotting function. In Python, with the visualisation library Matplotlib, you can call plt.bar(), and a chart appears. Microsoft Excel goes further still, delivering a fully formed graphic with colours and proportions chosen on your behalf. I wanted to build visualisations with greater intention and technical freedom, and this is what led me to the open source JavaScript library, D3.js.

D3 stands for Data-Driven Documents and is a low-level library which uses the full capabilities of web standards such as CSS, HTML, and SVG to build sophisticated and interactive data visualisations. While other visualisation tools hand you a bar chart or a scatterplot, to represent data in D3 you must manually calculate the scales, define the coordinate system, and bind the data to a graphical element. You must decide exactly where an axis sits and how a margin breathes, what a data point is—a circle, a path, a mark—and how it behaves when the data changes. Nothing appears unless you build it. D3 is a workshop full of raw timber and hand saws.
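To make that manual work concrete, here is a hand-rolled sketch of the mapping that d3.scaleLinear() encapsulates. It is plain JavaScript with no d3 import, and the function name linearScale is my own illustrative choice, not d3's API; d3's real scales add clamping, inversion, and tick generation on top of this arithmetic.

```javascript
// A linear scale maps a data domain onto a pixel range: this is
// the arithmetic that d3.scaleLinear() performs for you.
function linearScale([d0, d1], [r0, r1]) {
  return (value) => r0 + ((value - d0) / (d1 - d0)) * (r1 - r0);
}

// Map life expectancy (40–90 years) onto a 500px-wide chart:
const x = linearScale([40, 90], [0, 500]);
console.log(x(40)); // 0   (left edge)
console.log(x(65)); // 250 (midpoint)
console.log(x(90)); // 500 (right edge)
```

Every positioned element in a D3 chart ultimately passes through a mapping like this one, which is why scale decisions carry so much structural weight.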

With this in mind, I applied to the Data Visualisation Society’s mentorship program, intent on learning D3. Under the guidance of my brilliant mentor, Sam Bloom, I spent ten weeks at the end of 2025 working through the library’s fundamentals and building an interactive visualisation. We focused on first principles before developing an interactive scatterplot to explore Ancient Greek colour perception. Progress was slow at first because the learning curve was steep, but as I learned to build in D3 and perform this kind of digital carpentry, visualisation began to resemble construction. Every line of code was doing structural work.

Figures included in this essay show examples of the interactive scatterplot, which examines the sensory dimensions of Ancient Greek colour by focusing on the major colour adjectives used by Homer in the Iliad. The Ancient Greek experience of colour was inseparable from motion and shimmer. Colour was a basic unit of information which reflected the natural world—encoding brightness and darkness as fundamental dimensions. Greek colour terms prioritised not only luminosity but also the play of light across surfaces, the texture of materials, even the social standing implied by a sheen or shade. It was a colour vocabulary rooted in lived perception, rather than the modern hue-based categories we use today.

Selected excerpts from my D3 code illustrate how each visual element is constructed, for example, the multiple lines of code required to precisely position and size tick marks along each axis. The full project can be viewed here.
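The flavour of that tick-positioning work can be suggested with a simplified sketch. In D3, scale.ticks(n) hands back candidate tick values and you render each mark yourself; the naive stand-in below (tickValues is my own illustrative name) is evenly spaced, whereas d3's real implementation snaps to "nice" round steps.

```javascript
// A naive stand-in for what scale.ticks(n) returns: candidate tick
// values spread across the domain. (d3's real version snaps to
// "nice" round steps; this one is evenly spaced only.)
function tickValues(min, max, count) {
  const step = (max - min) / (count - 1);
  return Array.from({ length: count }, (_, i) => min + i * step);
}

// In D3 you would then position and size each mark yourself:
// a short line at the scaled position of each value, plus a label.
console.log(tickValues(0, 100, 5)); // [ 0, 25, 50, 75, 100 ]
```

Even this tiny routine involves decisions (how many ticks, where they sit, how long they are) that higher-level charting tools make silently on your behalf.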

When a reader encounters a clean scatterplot, they see only the front of the painting. They don’t see the scaffolding: the decisions about scale domains or the choices about what not to encode. While the interactive scatterplot I built at the end of the ten weeks was modest, I could explain why each element existed and how it related to the data. Each decision—scale, colour, interaction—could be justified. Good data visualisations often look deceptively simple. But this clarity is the result of many intentional decisions about the data and the visual design. 

For example, ~90% of the charts the Financial Times publishes are bar charts or line graphs, yet because these charts adhere to a defined set of design principles, down to the very placement of the title and the subtitle, the FT’s graphics are some of the most recognisable in newsroom data visualisation. This coherence is maintained through meticulous style guides, which dictate everything from the weight of an axis line to the specific hex code of a categorical blue. These guides function as a visual vocabulary or grammar. Alan Smith, the FT’s Head of Visual and Data Journalism, who led the design of the FT’s visual vocabulary, has previously championed the idea that a chart should be as readable as a sentence. Alberto Cairo, a professor of visual journalism at the University of Miami, has often argued that the most important part of a visualisation is the “reasoning” that happens before the first pixel is placed. In his book, The Art of Insight, he argues that there are really no rules of data visualisation, there’s just reason. Every design choice must be a defensible, rational response to the data and the intended audience.

These ideas are not confined to style guides or theory; they are just as persuasive when applied to animation and interactivity in visualisation. When such principles are applied with narrative intent, even complex data can be immediately comprehensible to an audience. A widely cited example of the power of a simple but intentional data visualisation is Hans Rosling’s 2006 TED talk. Rosling revealed patterns in a complex dataset through an animated scatterplot in which countries appeared as circles, mapped by income per capita (on the x-axis), life expectancy (on the y-axis), and population (the size of the circle). As the animation unfolds, these circles shift across the axes, allowing long-term trends to emerge gradually rather than all at once. Rosling paired this animation with carefully selected narration and emphatic gestures to guide his audience to the most meaningful changes as they occurred. The result was a complex global health story made easy to understand through intentional narrative decisions and clear visual structure.

The painted surface of Fra Angelico’s altarpiece is inseparable from its support. The relationship between the painted surface, the underlying preparation, and the wooden support beneath, makes the altarpiece a three-dimensional object rather than a flat image viewed only from the front. The butterfly-shaped beechwood inserts, which were set against the direction of the grain, introduced stresses that increased the risk of cracking, jeopardising the paint layer.

A data visualisation is a three-dimensional object of logic. If the underlying structure is weak, if scales are arbitrary or axes misleading, the surface won’t stand up to scrutiny. The narrative ‘paint’ (the colour palette, the interactivity, etc.) will eventually crack. For example, decisions about axis scales depend on what counts as meaningful in a given context. Although it is often suggested that a y-axis should begin at zero to preserve proportional accuracy, this convention can obscure important variation when the relevant changes are small, as is often the case with climate data. A review by Steven Franconeri, professor of psychology at Northwestern University, illustrates this clearly: a temperature chart anchored at zero degrees Fahrenheit flattens visible change, while a version scaled to the relevant temperature range makes trends legible without distorting the data. A widely criticised, since-removed National Review article employed a temperature chart with a lower bound of –10 degrees Fahrenheit, a choice that made recent increases in global temperature appear negligible.
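The effect of that baseline choice is easy to quantify. The sketch below uses illustrative numbers (not the actual data from either chart; pixelSpan is my own name for the helper) to compute how many pixels the same two-degree rise occupies on a 300-pixel-tall chart under two different y-domains.

```javascript
// How the y-axis domain changes the visual size of the same change:
// the fraction of the domain the change covers, times chart height.
// (Illustrative numbers only, not the cited charts' actual data.)
const pixelSpan = ([d0, d1], height, v0, v1) =>
  Math.abs((v1 - v0) / (d1 - d0)) * height;

// A rise from 57°F to 59°F on a 300px-tall chart:
console.log(pixelSpan([0, 60], 300, 57, 59));  // ~10px: nearly invisible
console.log(pixelSpan([56, 60], 300, 57, 59)); // 150px: clearly legible
```

The data is identical in both cases; only the domain decision, invisible to the reader, determines whether the trend reads as negligible or dramatic.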

Wood is a living thing and it needs to move. Buda and his team’s addition of a new, more encompassing frame made of chestnut wood and conical springs allowed the altarpiece painting to breathe through the natural movement of the wood in different directions. I developed my D3 visualisation in tandem with the JavaScript library React. In modern web development, React acts as the frame of chestnut wood. It is often described as a library for building user interfaces, but at its core it is a way of thinking about state and change. You describe what the interface should be given certain conditions, and React takes responsibility for updating it when those conditions shift. React holds the structure and lifecycle of my visualisation and D3 handles the math: scales, layouts, transitions that respond to data.
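That division of labour can be sketched in plain JavaScript, with no React or d3 imports and a hypothetical function name (layoutPoints): a pure function does the "D3 side," turning data into geometry, while a React component would render the returned objects as circle elements and simply call the function again whenever state changes.

```javascript
// The "D3 side" as a pure function: data in, geometry out. A React
// component would render each returned object as a <circle> element
// and re-run this layout whenever its state (the data) changes.
function layoutPoints(data, width, height, maxValue) {
  return data.map((d, i) => ({
    cx: ((i + 0.5) / data.length) * width, // evenly spaced across x
    cy: height - (d / maxValue) * height,  // value mapped upward on y
    r: 4,
  }));
}

// Three values on a 300×100 chart: circles at x ≈ 50, 150, 250.
console.log(layoutPoints([10, 20, 40], 300, 100, 40));
```

Keeping the layout pure like this is what lets the frame flex: React can re-render freely, and the geometry stays a deterministic function of the data.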

This article is not about JavaScript, or frameworks, or even data. It is about integrity in design. It is the realisation that the most important work we do as data visualisation developers is often the work that the reader will never see. When the San Domenico Altarpiece returns to the walls of the monastery, the public will see only the Virgin and Child, resplendent and serene. They will not see the new poplar inserts running parallel to the grain or the conical springs hidden within the frame. When we design good visualisations, we are doing something similar: building the foundations so that the story can stand on its own. We are building houses for data. Every axis, every scale, every line of code is a poplar insert aligned to the grain.

Categories: Code

Christine and the Magic Charts: A Data Visualization Book for Kids
https://nightingaledvs.com/christine-and-the-magic-charts/
Thu, 22 Jan 2026 15:50:08 +0000

The post Christine and the Magic Charts: A Data Visualization Book for Kids appeared first on Nightingale.

“Daddy, what’s your job?”
“Mom, what are those pretty pictures? I want to make some too!”

The idea

Anyone who loves their job has probably wanted to share it with their kids—get them excited about it, show how cool and meaningful it is. Even if they don’t follow in our footsteps, maybe they’ll at least respect and appreciate what their parents are passionate about.

Sometimes it’s just a dream, but we want to find a bright and engaging way to talk to our children about what we do for work.

Data flowers. Image provided by the authors.

That’s how it was for us—Alex and Natalia—working in the field of data visualization. We really wanted to share our world! Data visualization is amazing: it’s full of beauty and logic, sleek designs, a variety of charts, fascinating topics, and the chance to work with important data.

Moreover, working with data and visualization is not just interesting—it’s useful! Especially in our fast-changing world. We wanted to give children valuable skills early on so they’re ready to face the grown-up world.

Fragment from the book. Image provided by the authors.

We want to create shared, precious memories: to capture that magical moment when a child is still curious enough to wonder, “What does Mom or Dad do at work?”

So we thought: let’s tell and show them!

With these thoughts in mind, we started exploring the idea.

Natalia already had experience creating data viz characters and telling stories about them, but now she wanted to make stories not for adults but for kids—still all about data visualization. Alex already had experience writing books!

And we wanted to bring this story to life as a book!

We agreed to start the project and went off to brainstorm, sketch, and imagine!

Characters and first sketches

What’s a book without characters? Natalia decided it’d be better not to make them diagram-like people, but cute monsters or creatures. This way, they’d be easier for kids to tell apart—and we’d avoid having a big crowd of kids running around the book (great for comics, but not ideal for a storybook).

Naturally, the prototype for the girl character was Natalia’s own daughter, Maya—a curly-haired girl with red pigtails who loves bunnies. Over time, the character changed—her hair, color, and age evolved, which is completely normal.

We decided to name the girl Christine!

Christine sketches by Natalia. Image provided by the authors.

Then came the pie chart character. In the data viz community, the pie chart is often viewed with skepticism due to its limitations and how easily it can be misused. It’s unfortunate, because people do love the bright, round pie chart—it’s just part of reality. The trick is learning to use it well.

Our first diagram character was a pie chart, and we called him Piechi. Since pie charts need careful handling, Natalia imagined Piechi as a kind of dog that needs to be trained—not to overeat!

Piechi first sketches by Natalia. Image provided by the authors.

Everyone who learned about the book instantly loved Piechi. He became the mascot of the story and our favorite character—just like pie charts: lovable, though not always easy to manage.

Later, we started developing the Dad character, bits of the plot, and other chart-characters.
We tried several versions of the Dad—he’s a tired, somewhat sad data professional. But (spoiler!) this is so he can become joyful again by the end of the story.

At this early stage, the other chart characters were still not fully formed. But we did keep some early sketches of them too.

Character sketches by Natalia. Image provided by the authors.

The plot

So, you have an idea who this book is about—but what actually happens in it?

We decided to go with a plot as old as time: a girl travels into a mysterious world of data to rescue her father, who’s gone missing within it!

Alex worked on the twists and turns of the plot, inventing obstacles and adventures, vividly describing the challenges on Christine’s path to save her dad. He also dreamed up the mysterious chart characters who not only help Christine on her journey but teach her how to use each chart properly!

First plot sketches by Natalia. Image provided by the authors.

Each chart has its own personality and unique “diet.” They’ll share those secrets in the book, too!

Christine bravely journeys toward her goal—a mysterious Data Tower always shimmering on the distant horizon—accompanied by her loyal chart friends, overcoming tricky challenges to discover what happened to her father and to rescue him!

Illustrations

Of course, making a book isn't easy. We started with the plot and text, outlining the key story points and structure. After that, Natalia did a storyboard while Alex finished writing all the text. That's how we finally understood the storyline, the placement and meaning of illustrations, and completed the manuscript.

Then came the time to draw!

Natalia can draw, but mainly in small formats. She didn’t have experience with book illustration, and creating book artwork takes a lot of time—especially while working and raising a small child. It became clear we wouldn’t finish the illustrations in a year… or even two. So we decided to look for help and find ourselves a wonderful illustrator!

Illustration ideas by Natalia. Image provided by the authors.

This too was a challenge—we needed a style both authors liked, someone with experience in children’s books, available time, and ideally some familiarity with data visualization.

Left to right: Lena Krapiva, Nika Korsak, Anastasiya Lykova. Images provided by the authors.

All the illustrators were incredibly talented, though we couldn’t work with everyone. But it was amazing to see different takes on our characters—Piechi in particular got a lot of interpretations!

We used Lena Krapiva’s gorgeous illustrations to promote and mock up the project website. Images provided by the authors.

We tried out a few spreads with different illustrators before finally choosing Anastasiya Lykova as our lead illustrator. She has a young child herself, so the story resonated with her—and we loved her soft and expressive illustration style.

We didn’t want the book to be just a story—we wanted it to be useful too. So we included a chart chooser, and pages with profiles on each chart-character at the end of the book.

What’s next?

To start telling the world about the book, we put together a website introducing the story and its characters—the charts! Now this website has grown into a full-fledged data project for kids: Data2Kids! It includes a children’s competition, educational materials, merch, and of course, this book.

We even want to bring together a local community of data-parents and try out this format all together!

And we wanted to create more opportunities for shared activities between children and parents.

We decided to make a little workbook for kids: with fun, simple data visualization tasks, drawing prompts, unusual challenges, and ways to spend time together collecting data and making charts. The workbook is currently in development, and we’re testing the first version with our local community!

Our cutest and most beloved character is Piechi! We don’t sell him as merchandise, but we give away these unique toys as prizes in our competitions. Image provided by the authors.

With the book finally published and a growing local community of parents and children learning data visualization alongside the book's characters, we're excited to launch an international children's data-visualization competition: Data Kids!

Website screenshot. Image provided by the authors.

Dates will be announced soon—meanwhile, you can already explore examples of children’s data-viz projects and educational practices from our local contest and subscribe to the project’s newsletter! 

We’d be happy to see you there! And we really hope to run more data-visualization activities for kids this spring—starting with the themed workbook, where the book’s characters will help children practice creating and using charts.

Book mockups—but it’s not actually that thick, promise! Image provided by the authors.

If you’re interested in the Data2Kids project, and want to help introduce kids to the world of data and dataviz, check our book Christine and the Magic Charts!

Thanks for reading!

We hope that, like us, you want to pass on the magic of this unusual but fascinating profession to the next generation!

REVIEW: Connecting the Dots by Milan Janosov https://nightingaledvs.com/review-connecting-the-dots/ Thu, 15 Jan 2026 16:22:27 +0000 https://nightingaledvs.com/?p=24556

The post REVIEW: Connecting the Dots by Milan Janosov appeared first on Nightingale.

In our increasingly interconnected world, Connecting the Dots: How data, networks, and algorithms shape our world by Milan Janosov could not be any more poignant. Janosov walks readers through every level of networks, beginning with the individual and expanding out to different kinds of connections and what we can interpret from them. He assumes no prior knowledge from the reader and breathes life into network science with a light tone and culturally relevant examples.

Connecting the Dots is organized into three sections: “Our Data Selves,” “Networks Coming to Life,” and “Hitting the Big Time, Network Style.” After a short introduction, “Our Data Selves” eases the reader into the concept of individual datafication via online profiles. Janosov discusses both social media profiles as well as online shopping profiles, breaking down what kinds of data might be collected, how that data can be stored both statically and dynamically, and where technology may be collecting additional data about us and attaching it to our profiles, even if we don’t explicitly answer a question or survey. He concludes with the example of targeted coupon distribution and use, showing how data aggregated from many users over time can help companies predict consumer behavior.

“Networks Coming to Life” expands the reader’s purview to understand how interconnected profiles create a network. This section of the book is where Janosov’s unique approach to choosing examples shines. Unpacking everything from Game of Thrones character deaths to NFT art markets and even DJ popularity, Janosov explains the anatomy of networks, with their nodes and different types of links, and how networks are born, grow, and sometimes collapse. He engages the reader in discussions of rather heady scientific concepts, like weighted and directional relationships and preferential attachment, but keeps his writing accessible by using familiar topics as the backdrop.

Finally, “Hitting the Big Time, Network Style” brings the first two parts together and applies network theory to the real world. Instead of just showing where networks exist, Janosov demonstrates the utility of these mathematical concepts in the real world. He discusses how network theory can help predict the spread of disease, increase workplace productivity, engineer successful social media campaigns, and more. The final chapter also touches on a particularly timely subject: artificial intelligence. This chapter unpacks some of the inner workings of AI and is followed by a conclusion where Janosov ties all three parts together, leaving the reader with the feeling that they’ve tackled the challenge of the book, which will help them better understand more complex discussions of network theory.

When I first started reading Connecting the Dots, Janosov’s light and often joking voice immediately set this book apart from other network theory books and articles. His voice, paired with examples that I recognized, like Game of Thrones, electronic music, and oddly specific online ads, made me want to keep reading, even when I didn’t immediately recognize some of the more network theory-specific concepts. In addition to well-placed examples, Janosov’s organization is experimentation-forward. He often explains how he devised the idea for a project or analysis before unpacking what he actually did, which made me more invested in knowing what he found.

While Janosov provides ample links to view his network projects throughout the book, I was slightly disappointed to discover that none of the diagrams were printed on the page. That said, Janosov does not stop at just linking his own work, but often mentions further reading throughout the chapters and provides a comprehensive list of references by chapter at the end of the book, making Connecting the Dots a network of information in and of itself.

By the time I finished Connecting the Dots, I felt that my grasp of network theory was greatly improved. I would absolutely recommend this book for anyone curious about behavior prediction, datafication, or network theory. Connecting the Dots has a low barrier to entry and easily sheds light on what is often a very confusing topic.


Learn more about Connecting the Dots and preorder it on its website.

Categories: Reviews

Info+ https://nightingaledvs.com/info-plus/ Wed, 14 Jan 2026 16:02:12 +0000 https://nightingaledvs.com/?p=24511

The post Info+ appeared first on Nightingale.

A quiet moment, before the conference begins. Image credit: Pedro Cruz

Info+ is a long-standing data vis conference, held every two years in rotating locations. This year, it was hosted at Northeastern University in Boston (my alma mater) and chaired by Pedro Cruz of Northeastern and Sarah Williams of MIT. The event was an action-packed three days of workshops, keynotes, seminars, and social activities, and even included an art exhibition at the MIT Media Lab.

Opening night exhibition at the MIT Media Lab. Photo credit: Pedro Cruz

The conference was a dose of concentrated inspiration, with a head-spinning lineup of back-to-back 10-minute seminars by leading designers in the visualization field. By the second day, some unifying themes were definitely emerging from the blur of inspiration and ideas.

You can find recordings and abstracts for all of the talks on the conference homepage. A few selected presentations are also linked below.

From communication-to towards communication-with

As someone who’s been in the data vis community for a long time, the biggest change I noticed was a shift in the general framing of data vis problems. Instead of Tufte-esque critiques of “proper” visualization techniques or discussion of misinformation and misleading graphics in politics, the conversation (at least in this conference) has shifted strongly toward more participatory practices in data vis.

Talking about inflation. Photo credit: Jose Duarte

Rather than talking about how to present data so that people will understand it, the focus was on how to have conversations—with people, using data—and how to include appropriate context and resolution to help them see how it fits into and reflects their lives. This was reflected in games talking about inflation at the grocery store and local biodiversity challenges in college classrooms, mapping inclusive and discriminatory spaces for marginalized communities to inform urban planning, and using info vis techniques to map informal transportation networks in developing nations.

Mapping exclusionary spaces. Photo credit: Sofia Burgos-Thorsen

When communicating with disenfranchised groups (like middle-schoolers impacted by extreme climate events and migrants hesitant about motivations behind the intervention), it can also be a challenge to overcome obstacles to communication, like self-censorship and diminished agency.

Visualizing marginalized perspectives

Across many talks, there was a focus on using data as a form of community expression, and using locally-generated data to capture experiences that are often left out of the dominant narrative. The conference exhibition included a project recording the annual calendar of the Quechua people of the Amazon, organized around important agricultural and cultural events.

Map of cultural practices created by the Quechua people. Photo credit: Catherine D’Ignazio and Claudia Tomateo

Another team used conversations with migrants to improve shelters, focusing on designing features that will support them best in their transition. Data can also help to articulate deep-rooted structural inequalities, or something as “simple” as pronouncing someone’s name. It may also help us to question what we memorialize, how, and why. 

Designing for impact

Some talks showed how to use data in a political context, as a tool for advocacy and creating change. One project focused on providing legal evidence to demonstrate systematic displacement in the West Bank, another created an archive of communities erased by urban redevelopment in Seoul.

Mapping the land of dispossessed farmers in the West Bank. Photo credit: Gauri Bauhuguna

A blanket woven from currencies served as an entry point into deeper discussions about economic impacts and the many reasons for migration, informing and humanizing policy decisions at the UN. One team collaborated with corporate sustainability offices to use biodiversity data to create better-informed sustainability policy and achieve more meaningful targets. Data can also help to illustrate what is lost when policies change, such as local shore changes for communities in the Mediterranean, and the pain caused by lost reproductive rights.

A blanket highlighting the economic impacts and reasons for migration. Photo credit: Sarah Williams

Advocacy is one form of impact; others take a more neutral approach. Some speakers discussed using data journalism to represent geopolitical conflicts in an unbiased but informative way. Others illustrated the importance of thoughtful visualizations focused on place and the need to keep things simple when dealing with the practical realities of fast-paced projects in a newsroom. Conversely, including details in your charts can sometimes make them better, more interesting, and more understandable.

Visualizing ship motions related to undersea cable damage. Photo credit: Irene de la Torre Arenas

New modes for visualizing data

Of course, the medium we choose also influences what we observe. The representation of time in social media platforms can shape and even distort our perceptions. Using different modes of visualization (including touch and sound) can help people engage with and better understand different habitats on the ocean floor.

Visualizing sea floor habitats with visuals and texture. Photo credit: Jessica Roberts

Textiles have deep traditional roots and can evoke a softer expression of meaning, especially in a cultural context. Acoustic data can have profound emotional impact as well as quantitative meaning, and mixing auditory and visual explorations can encourage different modes of exploration, as well as creating more accessible tools.

Perhaps my favorite application of unexpected media was using folded paper as the basis for the conference identity, creating rich and nuanced visuals by simple physical means.

Behind the scenes view of creating a conference identity. Photo credit: Todd Linkner

Seeing the big picture

Stepping back from day-to-day practices, we also considered how visualization can be a reflection of worldview. Framing is a critical step for a designer grappling to create a visualization, and our underlying theories of change influence both how we approach and how we talk about data visualization.

Books that capture an entire worldview through visualization. Photo credit: Paul Kahn

What I didn’t hear

Across the entire conference, there was almost no mention of AI. Presenters were definitely using AI technologies for certain kinds of data, but their talks were focused on the output rather than the tools. The one talk focused explicitly on AI considered whether it is helpful to use visualization as an input for AI learning, and what properties of a visualization might make it more interpretable and more useful for training an AI. I’m not sure if that was incidental or intentional, but it was a notable absence when so much of our current discourse is dominated by AI froth.

Reflections to take forward

Coming out of these many conversations, I found myself wondering which of the “theory of change” approaches are most effective, for which audiences, and when. Some speakers mentioned negative receptions: from the CDC when talking about data rhetoric and emotional visualizations, and from institutions of higher education when talking about faculty pay inequity. Many others discussed the tangible impacts of their work in shifting stubborn social and policy problems.

As always, the key lies in consciously framing your data and your analysis: in terms of the context, your purpose, the audience, and the people impacted and involved. Across many projects, we heard designers talk about how to define and redefine the problem as a critical step in getting to insight and achieving a successful design. 

As a designer working in industry to create large platform software, I find that all design often gets simplified to UX. It was nice to step outside of that bubble for a moment and remember the many things that design does, and the different places that designers contribute. I do think there is an interesting conversation to be had between the perspective of creating large-scale tools to structure data exploration for decision making at scale, and the one focused on using bespoke and personalized data visualization for communication—either to or with—an audience once the analysis is complete. 

Many of the unique, nuanced and contextual factors in a dataset can get blurred out when analyzing data at scale, and much of the big picture gets lost when focusing only on the particularities of a specific dataset. And yet, both the large and the contextualized cases come down to helping humans create big-picture conclusions by understanding nuances in the data. Building systems to accommodate large, unwieldy, and heterogeneous datasets to connect across these different scales requires insights from both sides. Perhaps that’s a topic for the next conference.

Categories: Community

REVIEW: Everyday Data Visualization: A Refreshing Return to Fundamentals https://nightingaledvs.com/review-everyday-data-visualization/ Tue, 13 Jan 2026 15:07:29 +0000 https://nightingaledvs.com/?p=24524

The post REVIEW: Everyday Data Visualization: A Refreshing Return to Fundamentals appeared first on Nightingale.

A Zen Buddhist teacher, Suzuki Roshi, famously said, “In the beginner’s mind there are many possibilities, but in the expert’s there are few.” The implication is that expertise can inadvertently short-circuit creativity and curiosity. The quote rings true for me, resonant with my own occasional surprise at the success of someone’s seemingly off-the-wall data visualization project. Fortunately for us battle-weary data practitioners, the notion of beginner’s mind can be applied to a day as well as a career. I periodically delight in my renewed ability to reframe a problem in an unexpected way in the light of a morning following a satisfying sleep.

Desiree Abbott’s Everyday Data Visualization beckons even the thoroughly Tableau-tested and Power BI-ified among us back to the exhilarating feeling of beginner’s mind. While the book is pitched as a comprehensive introduction for newcomers to the field, experienced practitioners will find unexpected depth in Abbott’s treatment of foundational topics. Her master’s degree in physics brings scientific rigor to subjects like color theory that often receive only superficial treatment in visualization texts.

Color theory worth your time

Chapter 4, “Choosing Colors,” exemplifies what sets this book apart. Abbott doesn’t just tell you to use sequential palettes for ordered data—she explains why, grounding her advice in the mathematics of color spaces and the computational logic of RGB values. Her explanation of hexadecimal color notation transformed what I’d always treated as rote memorization into genuine understanding. She walks readers through why 255 becomes FF in hex notation, connecting bytes, bits, and the fundamental constraints of computer displays to the practical work of choosing colors for a dashboard.
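The arithmetic Abbott walks through is easy to check for yourself in a few lines of JavaScript. The `toHex` helper below is my own illustration, not code from the book: each RGB channel is one byte (8 bits), so its maximum value is 2^8 − 1 = 255, written FF in base 16, and a full color is three such bytes packed together.

```javascript
// One channel: 255 in decimal is FF in hexadecimal, and back again.
console.log((255).toString(16).toUpperCase()); // "FF"
console.log(parseInt("FF", 16));               // 255

// A full hex color is three two-digit channels: red, green, blue.
function toHex(r, g, b) {
  return "#" + [r, g, b]
    .map((c) => c.toString(16).padStart(2, "0")) // pad e.g. "a" → "0a"
    .join("");
}

console.log(toHex(255, 255, 255)); // "#ffffff" — white
console.log(toHex(70, 130, 180));  // "#4682b4" — steel blue
```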

This depth extends to palette selection. Abbott distinguishes between continuous color ramps, stepped versions of continuous palettes, and categorical schemes with precision rarely found in practitioner-oriented texts. Her discussion of when to use divergent palettes—”for continuous data that’s about variation around a meaningful single value”—gave me new language for decisions I’d been making intuitively for years. 
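To make that divergent-palette advice concrete, here is one way such a mapping can be sketched by hand—this is my own minimal illustration, not code from the book. Values are positioned by their distance from a meaningful midpoint, so the palette's neutral center lands exactly on that value; libraries such as D3 offer ready-made diverging scales with the same semantics.

```javascript
// Map a value into [0, 1] around a meaningful midpoint, so that
// min → 0, mid → 0.5 (the palette's neutral center), max → 1.
// A color interpolator would then turn that position into a color.
function divergingPosition(value, [min, mid, max]) {
  return value <= mid
    ? 0.5 * (value - min) / (mid - min)
    : 0.5 + 0.5 * (value - mid) / (max - mid);
}

// E.g. temperature anomalies, where 0 is the meaningful single value:
console.log(divergingPosition(-10, [-10, 0, 20])); // 0    — coldest
console.log(divergingPosition(0, [-10, 0, 20]));   // 0.5  — neutral
console.log(divergingPosition(10, [-10, 0, 20]));  // 0.75 — warm
```

Note that the two halves of the domain can be asymmetric, as above: what matters is that the neutral color is pinned to the meaningful value, not to the arithmetic middle of the range.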

Abbott’s lighthearted writing style keeps even technical material engaging. Her aside on the etymology of “uppercase” and “lowercase”—capital letters stored in the physically upper case of a printing press—exemplifies the “little rabbit holes” that propelled me through chapters I might have otherwise skimmed. She manages to make WCAG accessibility guidelines genuinely interesting, a feat I would not have thought possible.

Accessibility challenges that stick

The accessibility chapter challenged my practice in concrete ways. I hadn’t considered how the hover-based interactions I deploy constantly in both JavaScript and Tableau translate to nothing at all for keyboard-only users. Abbott’s treatment of this issue was neither preachy nor superficial—she provided actionable guidance while acknowledging real-world constraints. This balance characterizes her approach throughout: practical without being prescriptive, thorough without being pedantic.

Project management wisdom

The later chapters on project management offer hard-won wisdom on scope creep and stakeholder management. Abbott’s advice to “be specific nearly to the point of being pedantic when scoping the project” resonates with anyone who’s watched a two-week project balloon into two months (or six!). Her discussion of “too many cooks in the kitchen”—stakeholders who feed off each other’s displeasure and provide contradictory feedback—will strike a chord with consultants and in-house practitioners alike.

Particularly valuable is her advice on building visualizations for data that doesn’t yet exist. Rather than dismissing this as impossible, she provides concrete strategies: generate test data using the actual systems, use random data generators, or even prompt your favorite large language model with specific structural requirements. Her emphasis on “future-proofing” sparse data by leaving adequate space for categories to fill in later addresses a common but rarely discussed challenge.

What’s missing

For all its strengths, the book occasionally sacrifices depth for breadth. Part 1’s survey of visualization history and visual perception covers well-trodden ground without adding substantially to existing literature. Tool-specific guidance is intentionally minimal—Abbott frequently notes that implementation details “depend greatly on the tool you use”—which keeps the book from dating quickly but may frustrate readers seeking copy-and-paste solutions.

The book also assumes readers work primarily with traditional business intelligence tools rather than code-based approaches. Those of us migrating toward D3.js, Observable, or Svelte Plot will need to do our own translation work, though the fundamental principles Abbott articulates transfer readily to any medium.

The verdict

Everyday Data Visualization succeeds precisely because Abbott takes beginners seriously enough to teach them well. In doing so, she’s created a book that rewards careful reading from practitioners at any level. The beginner’s mind, after all, isn’t about knowing less—it’s about remaining open to learning more. Abbott’s book is an invitation to that openness, grounded in scientific rigor and leavened with genuine charm.

Whether you’re onboarding a junior analyst or simply seeking to shore up gaps in your own knowledge, this book deserves a place on your shelf.


Desiree Abbott’s Everyday Data Visualization is available from the publisher and other booksellers, including Amazon.

Categories: Reviews

Designing Mars: Transforming Scientific Data Into Human Understanding https://nightingaledvs.com/designing-mars/ Thu, 08 Jan 2026 16:10:00 +0000 https://nightingaledvs.com/?p=24526

The post Designing Mars: Transforming Scientific Data Into Human Understanding appeared first on Nightingale.

Mars is easy to measure but hard to understand. Numbers can describe it, but they don’t bring it closer. My Mars data visualization project began with a simple question: what if design could bridge the distance between science and human perception? What if the cold precision of planetary data could be reimagined as a visual story, one that anyone, from a child to a professional, could intuitively grasp?

Translating complexity into clarity

The work began deep within NASA’s archives: climate records, atmospheric compositions, geological mappings, and orbital analyses. Each dataset was vast, intricate, and uninviting. But design doesn’t fear complexity; it reorganizes it. Through visual semiotics, cognitive mapping, color theory, and narrative structure, I translated those scientific abstractions into patterns of meaning. The goal wasn’t to simplify the data, but to find its rhythm, the pulse hidden beneath the graphs.

Design, in this context, became a language of empathy. For children, Mars was reborn through expressive color fields, rounded geometry, and tactile playfulness—visuals that invite curiosity and spark questions rather than deliver answers. For adults, the system evolved into something more meditative: atmospheric gradients, precise linework, and editorial pacing that mirror the cadence of scientific reading. Together, these two design systems form parallel narratives that translate the same planet through different emotional grammars.

Encounters, not charts

Each visualization is not a chart but an encounter—a way to feel the Martian cold, the weight of its thin air, the long patience of its orbit. The data’s story emerges not from what it measures, but from how it’s seen. That is where design’s strength lies: in transforming measurement into experience, precision into perspective.

Design as interpretive intelligence

In a world drowning in information, this project argues for design as a form of interpretive intelligence. Data alone doesn’t make meaning; it waits for design to awaken it. Every chart, map, and dataset is a silent conversation until someone gives it voice—through composition, color, hierarchy, and narrative flow. The designer’s role is not to beautify facts but to translate them into human insight. This act of translation is not purely aesthetic; it’s ethical. It decides what people notice, what they value, what they remember. In that sense, design holds a quiet but immense responsibility, to make truth visible without distortion, to turn knowledge into understanding.

Empathy in the rational

When seen through that lens, Designing Mars becomes less a project about a distant planet and more a reflection on how we process the universe around us. It suggests that empathy can exist in the most rational of fields, and that even planetary science can be reimagined through emotion, curiosity, and wonder. The project was recently honored as a Silver Winner at the 2025 Spark Design Awards, a recognition that reinforces its central idea: that the future of information design lies in its ability to move people, not just inform them.

This project stands on a simple belief: design is not an accessory to science; it is the lens that makes science human. When we redesign how information is seen, we also redesign how it is understood. Through deliberate visual language, Mars transforms from a remote field of data into a world we can comprehend, connect with, and imagine. That is the quiet power of design: it doesn’t just clarify information; it brings the universe closer.

The post Designing Mars: Transforming Scientific Data Into Human Understanding appeared first on Nightingale.

Analytics Products Will Never Be Truly Human-Centered Until the Workplaces Behind Them Are https://nightingaledvs.com/analytics-products-never-human-until-workplaces-are/ Wed, 17 Dec 2025 16:34:49 +0000

I’ve been really excited to see a shift in analytics and business intelligence around more integration of human-centred design, ethics, and accessibility. I learn something new almost every day. However, I feel something is still missing from these conversations: whether these principles are being applied beyond the interface, in our workplaces too.

From what I’ve experienced and witnessed working in analytics, I don’t think I see the same strides in how analytics work gets done. For example, how many of us have kept producing while our lives were going through upheaval? How many have wondered if we can stay in our jobs, or even careers, because the way we’re expected to work is unsustainable to our well-being and personal lives? What might happen if we approach our work in a way that decenters speed, volume, and heroics, and recenters all humans involved?

My early days

I discovered data visualization in undergrad while studying cases like the Three Mile Island nuclear accident, where poor information design contributed to near or actual harm. It was one of the first moments in engineering where my ears perked up, especially around how data visualization bridges the analytical, creative, and human.

My early roles in quality improvement in hospitals only deepened that passion. I was fortunate to work alongside clinicians, designers, and researchers who introduced me to co-design methods, the importance of evaluation, and reframed users as collaborators.

Eventually, I landed my first role on an analytics team, supporting BI design and development. It was during a time when my mom was battling appendix cancer, and I was living at home to help with caregiving. My passion for this work quickly collided with the realities of how analytics gets done.

Deadlines versus trauma

When my mom was admitted to palliative care a year later, it happened to line up closely with a due date for a “high-stakes” report I was responsible for developing in Tableau, which I was learning how to use on my own. Because of the project’s size and weight, and the responsibility I felt to deliver, I would work a full day, bring my laptop to hospice care, and continue working near her bedside.

I could have asked for an extension or support. However, analytics routinely feels like a pressure cooker, especially on “high-stakes” projects. Plus, my qualifications were being openly questioned, I had been identified as one of the “single points of failure”, and I had been cautioned about the potential for blame if anything went wrong. Stepping away didn’t truly feel like an option; it was easy to feel cornered. On top of that, I was in my twenties, with undiagnosed neurodiversity and zero concept of needs and boundaries, and I was overwhelmed, confused, and exhausted.

At my mom’s funeral, a colleague asked when I might return to work, and relayed that people were getting anxious about report delivery. 

Her funeral was on a Friday. I went back to work on Monday. I finished developing and testing the report—and from what I remember, everyone received it when expected. 

I’m not sure if it felt like “a win” for me. It made me question how analytics workers are perceived. And, what did I just do?

Breaking points

The elements of that experience were not isolated to any individual, team, or organization, but were recurring threads I’ve encountered and witnessed time and time again as my career in analytics has progressed.

Fast-forward many years to a more recent contract, again as a BI designer and developer, where layers of challenging but common systemic pressures rattled my nervous system. I eventually had a major Autistic shutdown (an involuntary neurological response to sensory overload), and needed to leave.

I’ve listed some of the challenges below – do any of these resonate, neurodiverse or not?

Structural

  • Unclear or missing roles, scoping, processes, and standards
  • Unrealistic expectations around task complexity and timelines
  • Unpredictability requiring frequent context switching and quick adaptation to change

Cultural/interpersonal

  • Persistent state of urgency, with hustle and “just get it done” culture
  • Lack of autonomy and space, with ongoing progress checks and pressure points
  • Repeatedly having to overexplain, raise concerns, and justify boundaries 
  • Interdepartmental conflict and tension
  • Feeling held responsible for the success of the project

Environmental

For this experience, I was able to be fully remote. From research and my own previous jobs, I know there are several aspects of in-office environments that can be challenging for Autistic workers. These can include adherence to a 9-to-5 schedule, open-concept office spaces with bright lighting and noise, and pressure to attend social functions.

When layers like these start to compound, my nervous system gets flooded with input and demands, and can’t catch up. I get stuck in survival mode, and eventually break or shut down. Autistic burnout can look very different from our typical understanding of burnout, and recovery can require weeks to months (or even years) of deliberate care. Just to note, other Autistic people may have different experiences, supportive conditions, and responses – these are just my own.

Figure 1. Examples of supportive conditions for Autistic employees from a 2023 report by Autism Alliance Canada. It is important to note that Autistic employees and employers can work together to identify the supports that might work best.

At this point, I’m afraid of returning to analytics as it currently exists. It can feel inaccessible to neurodivergence, and unforgiving to responsibilities outside of work. But am I the only one who feels this way? 

Ripple effects: Tired teams, leaders, products, and users

From what I’m seeing across industry research, I don’t think I’m the only one finding this field challenging and unsustainable. Here are some highlights:

Data teams are already over capacity, despite ever-growing demands

In a 2023 survey of more than 900 data team practitioners and leaders across the United States and the United Kingdom, 84% said their workload exceeded their capacity, and 90% reported that it had increased from the year prior.

The vast majority of data engineering teams feel burnt out

Another survey of over 600 data engineers and managers found that nearly all of them (97%) reported feeling burnt out, primarily due to time spent fixing errors, maintaining data pipelines, and constantly playing catch-up with stakeholder requests. Nearly 90% reported frequent work-life disruptions. 70% said they were likely to leave their current company within a year, and almost 80% were considering leaving the field altogether.

Figure 2. Experiences and impacts of data analytics work on data engineers from a 2021 report by data.world and DataKitchen.

“When a deliverable is met, data engineers are considered heroes. However, “heroism” is a trap. Heroes give up work-life balance. Yesterday’s heroes are quickly forgotten when there is a new deliverable to meet.”

2021 Data Engineering Survey: Burned-out Data Engineers Call for DataOps

Analytics products aren’t sufficiently supporting our end users

In a 2025 survey of more than 200 product leaders, data teams, and executives, 40% said their data doesn’t sufficiently support decision-making, 51% said they can’t meaningfully interact with the data provided, and 29% said they export data to spreadsheets daily.

These are findings I’m not surprised to see, considering how we’re expected to work. From a design perspective, it can be a struggle to carve out time and space to sufficiently understand the data and users before I’m asked to quickly turn around a prototype. Plus, post-launch follow-up and evaluations don’t seem to gain traction before we’re onto the next priority.

We’re hoping AI will save us

In the same survey as above, 75% believe AI-powered analytics might finally help uncover value buried in data. But in a new study by MIT and Snowflake, 77% of data engineering teams reported that their workloads have grown even heavier, despite AI integration.

While AI has the potential to streamline tasks and improve product quality, a cracked foundation could limit its impact, and cause further complexity and burnout. 

Figure 3. Examples of external and internal pressures in analytics, as well as possible outcomes.

Diverse does not equal inclusive

In analytics, we often point to diversity as evidence that we’re on the right path. When concerns are raised about how pressures, workloads, and expectations may weigh differently across identities, they can be dismissed with the reassurance that our workplaces are “already pretty diverse.”

That might be partially true in terms of representation. A recent study by Statistics Canada showed that 60% of data scientists (one of many roles within analytics) are immigrants, with the majority of first languages being neither English nor French. About one-third of data scientists identify as women+ (defined by the study to include “women and some non-binary people”). 

It is important to recognize that diversity does not always equal inclusion. In other pieces published by Nightingale, Catherine D’Ignazio and Lauren F. Klein, authors of Data Feminism, speak to how racism and sexism are imbued in the end-to-end data lifecycle, reinforced by structures of power, and ultimately surface in our products. An online poll by Christian Osborne showed that 90% of respondents said they’ve experienced microaggressions at work, which can cause emotional and psychological harm, decrease job satisfaction, and increase turnover.

We can also be sensitive to trends across all workplaces. In 2024, the Diversity Institute, Future Skills Centre, and Environics Institute for Survey Research published a Canada-wide study on gender, diversity, and discrimination at work. The survey reinforces that workplace discrimination is more likely to be experienced by racialized and Indigenous peoples, women, persons with disabilities, 2SLGBTQ+ individuals, and young adults. It is crucial to recognize that intersectionality amplifies these effects, with racialized and Indigenous people more likely to face multiple forms of discrimination, especially related to gender, age, and disability. And, those who reported experiencing discrimination also reported poorer mental health. 

Even with diversity, we still need to ensure that our analytics workplaces make everyone feel safe, healthy, empowered, and valued. Diversity, equity, and inclusion (DEI) programming remains urgent and necessary, and should not be deprioritized or defunded. I wonder how the systemic pressures previously discussed are felt across different identities. For example, what are the experiences of a woman in a leadership role, a recent immigrant supporting family both at home and overseas, or a new grad with one or more disabilities? Are they really all the same?

What if we worked differently, and prioritized people first?

The tendency for analytics workplaces to be top-down, reactive, chaotic, transactional, and overburdening clearly isn’t working—not for our people, and not for our products. We’ve got more than enough burned-out workers and leaders, and more than enough underused products to prove it. And I’m only seeing signs that analytics (and tech more broadly) might be becoming even more unsustainable—from 996 culture, mandatory RTO policies, pressure to upskill for AI, and low data readiness for AI, to the defunding of DEI.

I think systemic change (or a reset button) is required to humanize our approach to analytics work. The shift has to include not only analytics teams, but also the ecosystems that rely on us. 

For example, earlier this year, the Canadian Occupational Health and Safety Magazine suggested that workplaces adopt a trauma-informed care (TIC) approach to work. This approach places safety, trust, and empowerment at the center, and recognizes that many of us have experienced trauma that workplaces can trigger, perpetuate, or even create. Many normalized aspects of analytics work, such as unpredictability, constant urgency, ambiguity, and the erosion of autonomy, can actually be quite harmful.

The article references the six pillars of TIC laid out by the Substance Abuse and Mental Health Services Administration (SAMHSA), and cites research showing its positive impacts on employee well-being, satisfaction, retention, operational functionality and effectiveness, and cost efficiency.

Figure 4. Six key principles of a trauma-informed approach, published by the Substance Abuse and Mental Health Services Administration (SAMHSA).

I have listed the six pillars from SAMHSA below, along with my attempt at (extremely) high-level and brief descriptions tailored to those of us working in analytics. I am still on my own learning journey. 

  1. Safety: Prioritize physical and psychological safety in all elements of the workplace. In analytics, this can mean that people are able to seek clarity, name concerns, and admit uncertainty without fear of punishment or loss of credibility. It can also mean that we respect limits on things like working hours, cognitive load, personal space, and sensory needs.
  2. Trustworthiness and Transparency: Build trust through consistent transparency around decisions, timelines, priorities, and changes. Clarity and predictability can reduce uncertainty, prevent reactivity, and stabilize teams.
  3. Peer Support: Reduce isolation and barriers to connection to foster peer support within and across teams. This can allow for greater understanding across disciplines and parts of the organization, smoother workflows, supportive relationships, shared problem-solving, and better knowledge transfer.
  4. Collaboration and Mutuality: Involve workers in decisions about policies, procedures, tools, standards, and more. Also, when business units and analytics teams better understand each other’s capacities, workflows, complexities, timelines, and needs, collaboration might be smoother, more respectful, and more productive.
  5. Empowerment, Voice, and Choice: Choice and control are essential for trauma-impacted people. In analytics, empowerment could mean giving workers more agency in defining things like their own scope, workflows, documentation, timelines, training needs, and work arrangements.
  6. Cultural, Historical, and Gender Sensitivity: Address systemic inequities and promote diversity, equity, and inclusion. Design systems from the start to acknowledge, understand, and respect differences. Do not rely on people to constantly identify, overexplain, or advocate for their needs.

Integrating TIC is a deep, long-term commitment that isn’t about checking boxes, a quick workshop, or adding a few supportive practices. It requires honest and sustained cultural and structural assessments, learning, planning, and shifts, and a more balanced distribution of power. But with a new reframing, maybe we can begin to view:

  • Workers as human, collaborators, creators, and both autonomous and interdependent 
  • Leaders as human, coordinators, facilitators, coaches, guides, and anchors
  • Work as collective, learning, growth-oriented, and sustainable 
  • Technology as supportive, enhancing, synchronizing, and shared 

This isn’t meant to be a silver bullet, and I know there are many other challenges in analytics that involve data, tools, processes, and more. It may also seem overly idealistic in our current systems. But I feel like tech is at a precipice, especially in the rush toward AI creation and adoption. We’re already seeing increased exploitation of labour and the environment in the AI space, without consideration of short or long term consequences. If we don’t care to stop and make our systems more sustainable, ethical, equitable, and accessible now—what does this mean for our (very near) future? 

I’m curious about what a different approach to analytics work might bring:

  • Will we have the space to maintain our health, relationships, and lives outside of work?
  • Will relationships within and between teams become more stable, empathetic, and productive—especially between analytics and business units?
  • Will we have more space in between deliverables to recover, reflect, and refine our systems?
  • Will our products become clearer, more cohesive, more aligned, actually used, and have impact?
  • Will we feel safe and supported to show up at work in our own unique ways?

The post Analytics Products Will Never Be Truly Human-Centered Until the Workplaces Behind Them Are appeared first on Nightingale.

From Metrics to Mood: The Emotional Story in A HYROX Race https://nightingaledvs.com/from-metrics-to-mood/ Tue, 02 Dec 2025 16:43:20 +0000

In the world of sports performance, data is everywhere. Watches track heart rates, apps monitor recovery, and race platforms log every split and second. But when all that data is condensed into a single visual, a story emerges: the numbers stop being neutral—they speak with raw emotion.

The aim was to analyse my performance in HYROX, a fast-growing hybrid fitness event that combines eight 1-kilometre runs with functional workout stations like sled pushes, burpees, and wall balls. The race’s structure naturally lends itself to analysis: clear segments, repeated runs, and measurable transitions. The goal of the visualisation was to explore how time, effort, and physiology interact across a physically demanding event. What actually transpired was a visualisation that was much more emotive: one projecting personal emotion, or how I felt about my performance.

The challenge of condensation

Athletic data is inherently multidimensional. Time, effort, and physiology interact in ways that are complex and deeply human. Condensing all that into a single visual means facing the same challenge every visualisation designer knows too well: what to keep, what to simplify, and what to discard.

This HYROX chart condensed over an hour of physical effort into a few compact panels. Rather than presenting the bars along the conventional layout (x-axis), I shaped the visual to mirror the race’s own rhythm. As the reader moves from left to right, they too move through each run and station. Following the visual rhythm of the page down to the second chart from the top, the viewer uncovers time spent at each station relative to the event average—a clear indication of where momentum built or faded. Green meant faster than average; red meant slower. A cumulative line showed the overall trajectory: moments of acceleration versus pauses of fatigue relative to the average athlete.
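The relative-to-average panel described above can be sketched in a few lines. This is not the author’s actual code; the station names, split times, and event averages below are entirely illustrative, and the chart layout is simplified to the core encoding: signed bars coloured by faster/slower, with a running line for the cumulative gap.

```python
# A minimal sketch of the "time vs. event average" panel.
# All numbers are made up for illustration.
import matplotlib
matplotlib.use("Agg")  # render off-screen, no display needed
import matplotlib.pyplot as plt

stations = ["SkiErg", "Sled Push", "Sled Pull", "Burpees",
            "Row", "Carry", "Lunges", "Wall Balls"]
my_times = [260, 170, 215, 230, 255, 140, 200, 420]    # seconds (hypothetical)
avg_times = [270, 190, 240, 225, 260, 150, 230, 330]   # event average (hypothetical)

# Positive delta = slower than the average athlete, negative = faster.
deltas = [m - a for m, a in zip(my_times, avg_times)]
colors = ["red" if d > 0 else "green" for d in deltas]

# Running total of the gap to the average athlete across the race.
cumulative = []
total = 0
for d in deltas:
    total += d
    cumulative.append(total)

fig, ax = plt.subplots(figsize=(8, 3))
ax.bar(stations, deltas, color=colors)                  # green/red judgement bars
ax.plot(stations, cumulative, color="black", marker="o",
        label="cumulative gap")                         # overall trajectory
ax.axhline(0, color="grey", linewidth=0.8)              # the "average athlete" baseline
ax.set_ylabel("seconds vs. event average")
ax.legend()
fig.tight_layout()
fig.savefig("hyrox_relative.png")
```

With these invented numbers, the lunges and sled pull bars come out green while the wall balls bar is deep red, mirroring the emotional reading the article describes.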

Design-wise, it worked. The streaks of green—for the lunges and sled pull stations—sparked a sense of pride. But as soon as I saw that one bar of deep red—the dreaded wall balls—I didn’t just see inefficiency; I felt disappointment. That’s when I realised how much emotional weight colour can carry in performance visualisation.

When color becomes judgement

It’s clear that colour can convey emotion. Warm hues suggest intensity, fatigue, or struggle, while cool tones evoke calm and control. These associations can subtly influence how athletes perceive their own performance. By using warm reds to mark high heart rate zones and difficult stations, and cool greens to indicate easier segments relative to the average, the visualisation established an intuitive “moral language”: a clear visual distinction between stronger and weaker performance that made the data instantly readable.

This raises a key design question: when visualising personal performance, are we aiming to motivate—or simply to measure? Should a chart make the athlete feel proud, or precise? The answer likely lies somewhere in between. The top chart, rendered in a calm blue gradient, remains neutral: it measures output without judgement. The chart below leans into emotion, using contrast and colour to spotlight effort and highlight moments of struggle.

Rhythm, not just metrics

The bottom half of the visualisation traced my heart rate throughout the race, capturing the ebb and flow of effort across running segments and workout stations. The rising and falling bands of orange and red felt like a heartbeat for the race itself—a pulse that mirrored moments of endurance, bursts of strain, and brief windows of recovery.

It wasn’t just data on a page; it was a rhythm you could feel. Peaks were sudden surges of intensity, while valleys were respites and recovery. Each station became a note in a composition of exertion and relief. In this way, visual structure itself conveyed effort before any labels or numbers were read. As designers, we often obsess over precision, but here, pacing and tempo communicated the human experience of performance more viscerally than any raw statistic ever could.

From metrics to meanings

What I learned from visualising my HYROX race wasn’t just where I was fast or slow, but how visualisation framed that story. Choices of colour, alignment, and context turned raw numbers into something interpretive—something emotional.

For data visualisation practitioners, that’s a valuable reminder: the goal isn’t only to display information, but to mediate understanding. The way we design a visual can shape not only what people learn, but how they feel about what they learn.

The post From Metrics to Mood: The Emotional Story in A HYROX Race appeared first on Nightingale.

In the Shadow of Edmund Halley: Solar Eclipses, Citizen Science, and Qualitative Dataviz https://nightingaledvs.com/in-the-shadow-of-edmund-halley/ Wed, 19 Nov 2025 16:08:08 +0000

On April 8, 2024, a total solar eclipse crossed North America from the Pacific Coast of Mexico to the island of Newfoundland, off the eastern coast of Canada. At its longest point, in the center of totality, the Moon covered the Sun for exactly four minutes and 28 seconds. 

In the months leading up to the 2024 eclipse, experts predicted that millions of people would migrate to the path of totality to witness this extraordinary event. Tiny towns across the continent braced themselves for tourists, advising residents to stock up on food and gasoline in case of shortages. Highway signs warned travelers to prepare for extended delays. Some people who lived on the edge of totality drove two or three hours from their hometown just to experience one extra minute of darkness.

Part of the beauty of a modern solar eclipse—indeed, the only thing that makes it possible to travel to the center line—is that we understand the science behind the phenomenon. Knowing exactly where and when the darkness will hit, we can anticipate it with excitement and pleasure.

Among ancient people, for whom the sudden disappearance of the Sun provoked fear and dread, those four minutes could not have passed more slowly.

More than three centuries ago, in 1715, another solar eclipse hit the scene smack-dab in the middle of the Age of Enlightenment. Just two decades earlier, Isaac Newton had published his Principia, ushering in the eponymous era of Newtonian physics. The Sun, Moon, and stars—once seen as mystical celestial bodies—had been reduced to mere balls of rocks and gas, subject to the same laws of motion and gravity as the rest of us on Earth.

Newton set in motion a reshaping of the universe: from a mysterious, unknowable cosmos into one governed by data. With enough data points, early Enlightenment thinkers hypothesized they could anticipate the future movements of every object in the universe.

The 1715 solar eclipse was noteworthy in many respects. It was the first eclipse to pass over London, England in more than 500 years. It was the first time the path of totality could be mapped in advance thanks to the new laws of astronomy and physics. It was, therefore, the first eclipse to attract tourists. And the first to inspire scientific investigation. 

Astronomer Edmund Halley, most famous for predicting the return of the comet that now bears his name, was also a data visualization pioneer. He published the world’s first weather map, which depicted trade and monsoon wind patterns across the globe and was subsequently used by sailors as a navigational tool. He is also recognized as the first to plot two variables against each other on a Cartesian plane (as seen in his bivariate plot of barometric pressure and altitude) and the first to use contour lines on maps.

Halley saw the upcoming solar eclipse as a chance to test out Newton’s theories of gravity and motion. He published a pamphlet that claimed the darkness was neither an evil omen nor a divine event, but in fact the “necessary result of the Motions of the Sun and Moon.”

Halley’s pamphlet included a map that depicted the path of totality as seen from above—the first of its kind ever recorded and one which sparked a “golden age of eclipse maps.” 

Halley also kicked off the first citizen science project in modern history. In his pamphlet, he addressed the “Curious” people of England, urging them to watch the sky during the eclipse and record their observations: “The Curious are desired to Observe it, and especially the duration of Total Darkness, with all the care they can; for therby [sic] the Situation and dimensions of the Shadow will be nicely determin’d…”

In the end, about 25 people answered Halley’s call, sending him the times that totality began and ended in their specific location, along with a short description of what they saw in the sky. Halley himself wrote about his own experience in a mix of both scientific and poetic observations: “by Nine of the Clock . . . the Face and Colour of the Sky began to change from perfect serene azure blew [sic] to a more dusky living Colour having an eye of Purple intermixt, and grew darker and darker till the total Immersion of the Sun…”

Halley used the data he collected to correct the path of totality on his map, setting the stage for countless future scientists and eclipse chasers.

Leading up to the 2024 total solar eclipse, I prepared myself as best I could. I booked a weekend cabin along the path of totality, bought eclipse glasses for my whole family, and stocked up on Moon Pies, Sun Chips, and Cosmic Brownies. I vowed not to take pictures during totality, desiring instead to stay fully present and “in the moment.” After all, the eclipse would likely be the most photographed astronomical event in human history; there would be plenty of opportunities to download iconic images later.

But nothing could have prepared me for the experience of totality: four minutes of darkness, of disorientation, of complete awe and wonder. Four minutes of walking a strange, fine line between science and mysticism. Four minutes of feeling connected to birds and squirrels, to everyone else who was watching the sky at the same moment, and even to the ancient Vikings, who believed eclipses resulted from a monster devouring the Sun.

I took pictures, of course: terrible, blurry, amateur shots from my iPhone. I couldn’t stop myself—I felt an overwhelming compulsion to capture the strange sights and sounds around me and to document that I was there.

Afterward, I couldn’t help but wonder whether other people felt that same sense of connection… or that same compulsion to take pictures. These weren’t questions of physical science, of course; nonetheless, they were questions that could be answered with data. Following in the footsteps of Edmund Halley, I sent out a call on social media, asking people to share their own photos and stories from the eclipse. Naively, optimistically, I hoped to receive hundreds, if not thousands of responses. But I’m no great social media influencer, and after posting my Google Form link everywhere I could imagine, I ended up with 62 responses—a tiny fraction of the total population who watched the eclipse. But to my delighted surprise, they represented a broad swath of locations along the path of totality and contained all the depth and complexity of a strong qualitative dataset.

Image provided by the author.

Initially, I created a Google map of the responses I received, a nod to Edmund Halley’s original visualization. But I wondered whether there might be a different way to present the data, one that could capture what the experience felt like.

So, I set out to analyze the rich mix of words and images that comprised my dataset. Using poetic inquiry, a qualitative process developed in the 1970s by multiculturalist and feminist researchers, I engaged in thematic analysis of respondents’ written submissions. A few themes that emerged in this process included feelings of transcendence (including connectedness to nature, humanity, and God), descriptions of the weather (especially the cool temperatures that accompanied the darkness), changes in animal behavior (dogs barking, birds roosting), and a communal feeling of celebration (gathering, cheering, public festivities). I highlighted certain “poetic turns of phrase” that appeared in participants’ responses; then I cut and pasted words and phrases to create 10 found poems that each represented a shared theme from participants’ experiences. (A condensed version of the poems, entitled “Six Ways to View an Eclipse,” appears in the online literary journal Unlost). 

I also coded the photos that I received. Most people submitted some version of the Moon covering the Sun during totality; these photos were coded based on the size of the Moon, whether it was in the foreground or background, and what other elements appeared in the photo (such as people, buildings, or trees). Some photos were taken before or after totality, featuring a “crescent sun,” and a few included people without the Sun or Moon appearing at all. In the end, I selected 20 photos that collectively showcased all the different visual elements that appeared in the dataset.

Images provided by the author.

In thinking about how to visualize this data, I wanted to create an opportunity for viewers to interact with the photos and poems in a novel way. After brainstorming several installation ideas with the team at Fusiform Props and Exhibits, I finally settled on the idea of printing the photos and poems using a special technique called lenticular printing. Lenticular printing is a technology that uses plastic lenses with ridges on top to display multiple interlaced images at once. The different images float in and out of visibility, depending on the angle from which the print is viewed.

Each of the final lenticular prints consisted of two photos and one poem, thereby displaying the words and images from multiple participants at one time. From April to June of 2025, the 10 prints appeared as part of a larger exhibition, entitled “Data Is Poetry,” at Artspace in Shreveport, LA.

During the opening reception, I watched as people walked past the prints on the wall. Most strolled past casually at first, then did a double-take after realizing that the prints contained “hidden” images and words. They proceeded to adjust their position, moving forward, backward, and side to side as they tried to see (and read) all the layers in the image.

I was reminded of my own experience from a year earlier and how earnestly I had watched the sky through my eclipse glasses, looking for the slightest changes in the Sun as the Moon passed in front of it. The data visualization, therefore, mirrored the eclipse itself—an astronomical phenomenon that shifted with mathematical precision based on angles and movement. 

But the visualization also effectively symbolized our shared experience of the eclipse. Though all of the participants in the project had shown up for the same event, their view was necessarily determined, and limited, by their specific location and context. Only by compiling multiple viewpoints could we see the composite: a collective phenomenon that was as human as it was cosmic.

The post In the Shadow of Edmund Halley: Solar Eclipses, Citizen Science, and Qualitative Dataviz appeared first on Nightingale.
