MERL Tech News

Please Submit Session Ideas for MERL Tech Jozi

We’re thrilled to announce that we’re organizing MERL Tech Jozi for August 2018!

Please submit your session ideas or reserve your demo table now, to explore what’s happening with innovation, digital data, and new technologies across the monitoring, evaluation, research, and learning (MERL) fields.

MERL Tech Jozi will be in Johannesburg, South Africa, August 1-2, 2018!

At MERL Tech Jozi, we’ll build on earlier MERL Tech conferences in DC and London, engaging 100 practitioners from across the development and technology ecosystems for a two-day conference seeking to turn theories of MERL technology into effective practices that deliver real insight and learning in our sector.

MERL Tech is a lively, interactive, community-driven conference.  We’re actively seeking a diverse set of practitioners in monitoring, evaluation, research, learning, program implementation, management, data science, and technology to lead every session.

Submit your session ideas now.

We’re looking for sessions that focus on:

  • Discussions around good practice and evidence-based review
  • Innovative MERL approaches that incorporate technology
  • Future-focused, thought-provoking ideas and examples
  • Conversations about ethics, inclusion, and responsible policy and practice in MERL Tech
  • Exploration of complex MERL Tech challenges and emerging good practice
  • Workshop sessions with practical, hands-on exercises and approaches
  • Lightning Talks to showcase new ideas or to share focused results and learning
Submission Deadline: Friday, March 31, 2018.

Session submissions are reviewed and selected by our steering committee. Presenters and session leads will have priority access to MERL Tech tickets. We will notify you in late April whether your session idea has been selected. If selected, you will be asked to submit the final session title, summary, and detailed session outline by June 1, 2018.

If you’d prefer to showcase your technology tool or platform to MERL Tech participants, you can reserve your demo table here.

MERL Tech is dedicated to creating a safe, inclusive, welcoming and harassment-free experience for everyone through our Code of Conduct.

MERL Tech Jozi is organized by Kurante and supported by the following sponsors. Contact Linda Raftree if you’d like to be a sponsor of MERL Tech Jozi too.

MERL Tech London 2018 Agenda is out!

We’ve been working hard over the past several weeks to finish up the agenda for MERL Tech London 2018, and it’s now ready!

We’ve got workshops, panels, discussions, case studies, lightning talks, demos, community building, socializing, and an evening reception with a Fail Fest!

Topics range from mobile data collection to organizational capacity, learning and good practice for information systems, data science approaches, qualitative methods using mobile ethnography and video, biometrics and blockchain, data ethics and privacy, and more.

You can search the agenda to find the topics, themes, and tools that are most interesting to you, identify sessions that are most relevant to your organization’s size and approach, pick the session methodologies that you prefer (some of us like participatory and some of us like listening), and learn more about the different speakers and facilitators and their work.

Tickets are going fast, so be sure to snap yours up before it’s too late! (Register here!)

View the MERL Tech London schedule & directory.

DataDay TV: MERL Tech Edition

What data superpower would you ask for? How would you describe data to your grandparents? What’s the worst use of data you’ve come across? 

These are a few of the questions that TechChange’s DataDay TV Show tackles in its latest episode.

The DataDay Team (Nick Martin, Samhir Vasdev, and Priyanka Pathak) traveled to MERL Tech DC last September to ask attendees some tough data-related questions. They came away with insightful, unusual, and occasionally funny answers.

If you’re a fan of discussing data, technology and MERL, join us at MERL Tech London on March 19th and 20th. 

Tickets are going fast, so be sure to register soon if you’d like to attend!

If you want to take your learning to the next level with a full-blown course, TechChange has a great 2018 schedule, including topics like blockchain, AI, digital health, data visualization, e-learning, and more. Check out their course catalog here.

What about you? What data superpower would you ask for?

Self-service data collection with the most vulnerable

This is a summary of a Lightning Talk presented by Salla Mankinen, Good Return, at MERL Tech London in 2017. 

When collecting data from the most vulnerable target groups, organizations often rely on methods such as guesstimates, enumerator-led interviews, SMS, or interactive voice response (IVR). The organization Good Return created a smartphone and tablet app that allowed vulnerable groups to interact directly with the data collection tool, without training or previous exposure to any technology.

At MERL Tech London in February 2017, Salla Mankinen shared Good Return’s experiences with using tablets for self-service check in at village training centers in Cambodia.

“Our challenge was whether we could have app-based, self-service data collection for the most vulnerable and in the most remote locations,” she said. “And could there be a journey from technology illiteracy to technology confidence” in the process?

The team created a voice- and image-based application that worked even for those who had little technology knowledge. It collected data from village participants through questions such as “Why did you miss the last training session?” or “Do you have any money left this week?”

By the end of the exercise, 72% of participants felt confident with the app and 83% said they felt a lot more confident with technology in general.

Watch Salla’s presentation here or take a look at her slides here!

Register now for MERL Tech London, March 19-20, 2018!

Moving from “evaluation” to “impact management”

by Richa Verma, Resident Entrepreneur at Social Cops. This post originally appeared on the Social Cops blog on August 28, 2017.

When I say that Impact Evaluation is history, I mean it. Some people will question this. After all, Impact Evaluation just became mainstream in the last decade, driven by great improvements in experimental design methods like randomized controlled trials (RCTs). So how can I say that it’s already a thing of the past? It’s not Impact Evaluation’s fault. The world changed.

Methodologies like RCTs came from medical science, where you can give patients a pill and assess its impact with randomized trials. However, development is not a space where one pill will work for everyone. In development, the patients change faster, the illness evolves faster, and the pill needs to keep pace with both the patients and the illness. That’s where Impact Management comes in.

What Is Impact Management?

New Philanthropy Capital’s 2017 Global Innovation in Measurement and Evaluation Report counts Impact Management as one of the top 7 innovations of 2017.

So what is Impact Management? Let me first explain what it is not. It’s not a one-time evaluation. It’s not collecting data for answering a limited set of questions. It’s not a separate activity from your program. It’s not just monitoring and evaluation.

It’s a way of making data-driven decisions at every step of your program. It’s about keeping a pulse on your program every day and finding new questions to answer, rather than just focusing on specific questions predetermined by your monitoring and evaluation team or funders.

“The question that’s being asked more and more is, ‘How does evaluation feed into better management decisions?’ That’s a shift from measurement of impact, to measurement for impact.”
– Megan Campbell (Feedback Labs)

How Does Impact Management Work?

Impact Management uses the basic components of monitoring and evaluation, but with an outlook shift. It involves frequent data collection, regular reporting and monitoring of your data, and iteratively updating your program indicators and metrics as data comes in and the program changes.

Impact Management differs from Impact Assessment in that it promotes course correction on a daily basis. Organizations collect data on their programs as they conduct activities, analyze that information on a regular basis, and make changes to the program.

With an outlook that encourages frequent changes, as if you were trading in stocks, organizations have the ability to A/B test their programs with real-time data and make decisions immediately, rather than waiting to compare and contrast two different surveys. They can test out new things and make changes as soon as data reaches their servers, even at the end of the day, rather than waiting for the official year-end review. It becomes a way of deciding how to execute a program daily rather than only seeing strategic changes through.

“[Data collection] should be ongoing — it’s a value driver not a compliance requirement.”
– Tom Adams (Acumen)

In many ways, this is how decisions are made on Wall Street or Dalal Street in India. Analysts don’t wait until the end of the year to make investments by reviewing annual reports. They watch daily as the market fluctuates and strike as soon as they see new potential.

Impact Management works exactly the same way. You should strive to increase your impact as soon as opportunity arises, rather than waiting for a year-end external evaluation or approval.
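To make the A/B testing idea above concrete, here is a minimal sketch in Python (mine, not from the original post) of comparing two program variants as response data comes in. The CSV file and the “variant” and “attended_training” columns are hypothetical.

```python
from math import sqrt
from statistics import NormalDist

import pandas as pd

# Hypothetical export of responses collected so far: one row per participant,
# a "variant" column (A or B), and a binary "attended_training" outcome (0/1).
responses = pd.read_csv("responses_to_date.csv")

summary = responses.groupby("variant")["attended_training"].agg(["count", "mean"])
print(summary)  # sample size and attendance rate per variant so far

# Simple two-proportion z-test: is the gap between variants more than noise?
a = responses.loc[responses["variant"] == "A", "attended_training"]
b = responses.loc[responses["variant"] == "B", "attended_training"]
p_pool = (a.sum() + b.sum()) / (len(a) + len(b))
se = sqrt(p_pool * (1 - p_pool) * (1 / len(a) + 1 / len(b)))
z = (a.mean() - b.mean()) / se
p_value = 2 * (1 - NormalDist().cdf(abs(z)))
print(f"z = {z:.2f}, p = {p_value:.3f}")
```

Run against yesterday’s export and again against today’s, the same few lines answer the same question with fresher data, which is the point of managing impact rather than only evaluating it.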

How Can You Implement Impact Management?

To make Impact Management possible, switch from static data files to a flexible data system.

Today, most of your program officers and even your beneficiaries are armed with mini-computers in their pockets (read: smartphones). Leverage these to create a network of data ingestion devices, continuously tracking and measuring the impact of your programs. Use mobile data collection apps to add forms, deploy them to the field, and reach out not just to your field force but also your beneficiaries — not just at the end of the month or quarter, but as frequently as possible.

Then don’t let this data sit in Excel files. Use today’s technologies to create your own data management system, one that will link your beneficiaries, connect your programs, and answer queries. Have someone with an analytical bent look at this data regularly, or draw on machine power to analyze this data and generate meaningful insights or reports in real time.
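As a rough illustration of what “don’t let the data sit in Excel files” can look like in practice, here is a minimal sketch (my own, not a prescription from the original post) of a script that appends each incoming batch of mobile-collected records to a small database and regenerates a summary on demand. The file, table, and column names are hypothetical.

```python
import sqlite3

import pandas as pd

DB = "program_data.sqlite"  # hypothetical local data store


def ingest(csv_path: str) -> None:
    """Append a batch of newly collected records to the database."""
    batch = pd.read_csv(csv_path)  # e.g. today's mobile data collection export
    with sqlite3.connect(DB) as conn:
        batch.to_sql("responses", conn, if_exists="append", index=False)


def daily_summary() -> pd.Series:
    """Summarize a key indicator across all data collected so far."""
    with sqlite3.connect(DB) as conn:
        df = pd.read_sql("SELECT * FROM responses", conn)
    # Hypothetical indicator: training attendance rate by district
    return df.groupby("district")["attended_training"].mean().round(2)


if __name__ == "__main__":
    ingest("todays_export.csv")
    print(daily_summary())
```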

“We’re moving away from a static data world, where you work on datasets, and you write reports, to a dynamic data world where data is always being generated and created and it helps you do your job better.”
– Andrew Means (beyond.uptake)

Lastly, it’s crucial to tie this flexible data system back to your decisions. Make real-time data — rather than guesses or last year’s data — the basis of every program decision and the foundation of even weekly catch-ups. And don’t hesitate to test out new things. Data will tell you whether something worked or not.

Many of our partners are using our platform to make Impact Management possible and track their programs in real time. The platform lets them create and tweak data collection forms, and monitor incoming data in real time on their computer, in regular reports, or even on map-based dashboards. They are asking new questions about how their programs are doing and answering them with data.

If we really want to create the best development programs, we’ll have to think differently and use evidence not just once every month or year, but as we make crucial decisions every day. All backed by the tenets of Impact Management: test, fail, improve, repeat.

Join us at MERL Tech London on March 19-20 – where we’ll be debating this topic!

MERL Tech 101: Google Forms

by Daniel Ramirez-Raftree, MERL Tech volunteer

In his MERL Tech DC session on Google Forms, Samhir Vasdev from IREX led a hands-on workshop that laid out some of the software’s capabilities and limitations. Much of the session focused on Google Forms’ central concepts and the practicality of building a form.

At its most fundamental level, a form is made up of several sections, and each section is designed to contain a question or prompt. The centerpiece of a section is the question cell, which is, as one would imagine, the cell dedicated to the question. Next to the question cell there is a drop-down menu that allows one to select the format of the question, which ranges from multiple choice to short answer.

At the bottom right-hand corner of the section you will find three dots arranged vertically. When you click this toggle, a drop-down menu will appear. The options in this menu vary depending on the format of the question. One common option is to include a few lines of description, which is useful in case the question needs further elaboration or instruction. Another is the data validation option, which restricts the kinds of text that a respondent can input. This is useful when, for example, the question is in short answer format but the form administrators need the responses limited to numerals for the sake of analysis.

The session also covered functions available in the “Responses” tab, which sits at the top of the page. Here one can find a toggle labeled “Accepting responses” that can be turned on or off depending on the needs of the form.

Additionally, in the top right corner of this tab there are three dots arranged vertically; this is the options menu for the tab. Here you will find options such as enabling email notifications for each new response, which can be used if you want to be alerted whenever someone responds to the form. Also in this drop-down, you can click “Select response destination” to link the Google Form with Google Sheets, which simplifies later analysis. The green Sheets icon next to the options drop-down will take you to the sheet that contains the collected data.
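If you prefer to work on the responses outside of Sheets, here is a minimal sketch in Python (assuming you have downloaded the linked response sheet as a CSV; the file name and the question column are hypothetical):

```python
import pandas as pd

# Download the linked response sheet as a CSV (or export it from the Responses tab);
# the file name here is hypothetical.
responses = pd.read_csv("workshop_feedback_responses.csv")

# Google Forms adds a "Timestamp" column to the linked sheet automatically.
responses["Timestamp"] = pd.to_datetime(responses["Timestamp"])

# Hypothetical question column: tally the answers to one multiple-choice question.
print(responses["Did you like the workshop?"].value_counts())
```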

Other capabilities in Google Forms include the option to change the color scheme, which you can access by clicking the palette icon at the top of the screen. Also, by clicking the settings button at the top of the screen, you can limit respondents to one response each to prevent people from skewing the data by submitting multiple responses, or you can enable response editing after submission to allow respondents to go back and correct their responses.

Branching is another important tool in Google Forms. It can be used in the case that you want a particular response to a question (say, a multiple choice question) to lead the respondent to another related question only if they respond in a certain way.

For example, suppose one section asks “Did you like the workshop?” with the answer options “yes” and “no,” and you only want to ask what they didn’t like if they answer “no.” You can design the form to send the respondent to a section with the question “What didn’t you like about the workshop?” only when they answer “no,” and then bring them back to the main workflow after they’ve answered this additional question.

To do this, create at least two new sections (by clicking “add section” in the small menu to the right of the sections), one for each path that a person’s response will lead them down. Then, in the options menu on the lower right-hand side, select “go to section based on answer” and, using the menu that appears, set the path that you desire.

These are just some of the tools that Google Forms offers, but with just these it is possible to build an effective form to collect the data you need. Samhir ended with a word of caution that Google has been known to shut down popular apps, so you should be wary about building an organization strategy around Google Forms.

Gender-Based Violence Information System Design

by Stacey Berlow of Project Balance. Stacey co-facilitated a session on a Gender Based Violence Information System in Zambia at MERL Tech DC.

A big thank you to our client, World Vision, and to Yeva Avakyan, Head of Gender and Inclusion at World Vision USA, for inviting Project Balance to participate in the recent panel sessions at MERL Tech and InterAction.

We participated in the panels with World Vision colleagues Holta Trandafili, Program Quality Specialist, and John Manda, Senior Monitoring and Evaluation Technical Advisor for Zambia, last week (September 6-8). The sessions received a lot of great feedback and participation. Yeva and her team put together this impactful video about the prevalence of gender-based violence in Zambia and how One Stop Centers provide needed services to women and men who experience violence in their lives. World Vision works in close collaboration with the Zambian government to roll out gender-based violence support services.

The 2014 Zambian statistics are compelling:

  • 37% of women experienced physical violence within the 12 months prior to the survey.
  • 43% of women age 15-49 have experienced physical violence at least once since age 15.
  • 47% of ever-married women age 15-49 report ever having experienced physical, sexual, and/or emotional violence from their current or most recent partner.
  • 31% report having experienced such violence in the past 12 months.
  • Among ever-married women who had experienced intimate partner violence (IPV) in the past 12 months, 43% reported experiencing physical injuries.
  • 10% of women reported experiencing violence during pregnancy.
  • 9% of women who have experienced violence have never sought help and never told anyone about the violence.

The drivers of GBV include:

  • Norms that teach women to accept and tolerate physical violence, and teach men that it is normal to beat their wives.
  • Extreme poverty, high levels of unemployment
  • Women’s extreme economic dependence on men
  • Socialization practices of boys and girls in schools and the community
  • Sexual cleansing practices
  • Belief that having sex with a child who is a virgin will cure HIV/AIDS
  • Initiation ceremonies that encourage young women to be submissive
  • Forced early & child marriage

An occasionally connected system was built that allows facilities to enter and save data and run reports locally; when there is an internet connection, the data automatically synchronizes to a central server, where data from across all facilities is available for reporting.

[Diagram: GBVIMS system setup]
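For readers curious about the general pattern behind an “occasionally connected” system, here is a minimal sketch of the idea in Python. It illustrates the approach only, not Project Balance’s actual implementation; the endpoint, table, and field names are hypothetical.

```python
import json
import sqlite3

import requests

DB = "facility_records.sqlite"              # local store at the facility
SERVER = "https://example.org/api/records"  # hypothetical central endpoint


def save_locally(record: dict) -> None:
    """Save a record immediately, whether or not the facility is online."""
    with sqlite3.connect(DB) as conn:
        conn.execute(
            "CREATE TABLE IF NOT EXISTS records "
            "(id INTEGER PRIMARY KEY, payload TEXT, synced INTEGER DEFAULT 0)"
        )
        conn.execute("INSERT INTO records (payload) VALUES (?)", (json.dumps(record),))


def sync_when_connected() -> None:
    """Push any unsynced records to the central server once a connection is available."""
    with sqlite3.connect(DB) as conn:
        rows = conn.execute("SELECT id, payload FROM records WHERE synced = 0").fetchall()
        for row_id, payload in rows:
            try:
                resp = requests.post(SERVER, json=json.loads(payload), timeout=10)
                resp.raise_for_status()
            except requests.RequestException:
                break  # still offline or server unreachable; try again later
            conn.execute("UPDATE records SET synced = 1 WHERE id = ?", (row_id,))
```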

A participant asked John how the collected data were used and whether the technology had positively impacted the program. John gave some concrete examples of how the data showed differences between facilities and regions in the number of people receiving certain types of services. The Zambian team asked, “Why the differences?” This led to analysis of processes, which have since been adjusted so that survivors can receive medical and psychological help as soon as possible. The technology allows trends to be identified earlier through automated reporting, rather than having to hand-calculate indicators at the end of each reporting period. As the technology provider, we were excited to hear that the GBVIMS is being actively implemented and that program participants and managers are using the data for decision making. It’s wonderful to be part of such a passionate and professional team.

M&E software – 8 Tips on How to Talk to IT Folks

You want to take your M&E system one step further and introduce proper M&E software? That’s great, because software has the potential to make the monitoring process more efficient and transparent, reduce errors, and produce more accurate data. But how do you go about it? You have three options:

  1. You build your own system, for example in Microsoft Excel;
  2. You purchase an M&E software package off-the-shelf;
  3. You hire an IT consultant to set up a customized M&E system according to your organization’s specific requirements.

If options one and two do not work out for you, you can hire consultants to develop a solution for you. You will probably start a public tender to find the most suitable IT company to entrust with this task. While there are a lot of things to pay attention to when formulating the Terms of Reference (TOR), I would like to give you some tips specifically about communication with the hired IT consultants. These insights come from years of experience on both sides: being the party who wants a tool and needs to describe it to the implementing programmers, and being the IT guy (or rather lady) who implements Excel and web-based database tools for M&E.

To be on the safe side, I recommend working with this assumption: IT consultants have no clue about M&E. There are few IT companies that come from the development sector, like energypedia consult does, and are familiar with M&E concepts such as indicators, logframes and impact chains. To still get what you need, you should pay attention to the following communication tips:

  1. Take your time explaining what you need: Writing TOR takes time – but it takes even longer and becomes more costly when you hire somebody for something that is not thought through. If you don’t know all the details right from the start, get some expert assistance in formulating terms – it’s worthwhile.
  2. Use graphs: Instead of using words to describe your monitoring logic and the system you need, it is much easier to make graphs to depict the structure, user groups, linking of information, flow of monitoring data etc.
  3. Give examples: When unsure about how to put a feature into words, send a link or a screenshot of the function that you might have come across elsewhere and wish to have in your tool.
  4. Explain concepts and terminology: Many results frameworks work with the terms “input” and “output”. Most IT guys, however, will not have equipment and finished schools in mind, but rather data flows that consist of inputs and outputs. Make sure you clarify this. Also, the term web-based or web monitoring itself is a source of misunderstanding. In the IT world, web monitoring refers to monitoring activity in the internet, for example website visits or monitoring a server. That is probably not what you want when building up an M&E system for e.g. a good governance programme.
  5. Meet in person: In your budget calculation, allow for at least one workshop where you meet in person, for example a kick-off workshop in which you clarify your requirements. This is not only a possibility to ask each other questions, but also to get a feeling of the other party’s language and way of thinking.
  6. Maintain a dialogue: During the implementation phase, make sure to stay in regular touch with the programmers. Ask them to show you updates every once in a while to allow you to give feedback. If the programmers are heading in the wrong direction, you want to find out sooner rather than later.
  7. Document communication: When we implement web-based systems, we typically create a page within the web platform itself that outlines all the agreed steps. This list serves as a to-do list and an implementation protocol at the same time. It facilitates communication, particularly when multiple people are involved on both sides who are not always present in all meetings or phone calls.
  8. Be prepared for misunderstandings: They happen. It’s normal. Plan for some buffer days before launching the final tool.

In general, the implementation phase should allow for some flexibility. As both parties learn from each other during the process, you should not be afraid to adjust initial plans, because the final tool will benefit greatly from it (if the contract has some flexibility). Big customized IT projects take some time.

If you need more advice on this matter and some more insights on setting up IT-based M&E systems, please feel free to contact me any time! In the past we have supported clients by setting up a prototype for their web-based M&E system with our flexible WebMo approach. During the prototype process the client learned a lot, and afterwards it was quite easy for other developers to copy the prototype and migrate it to, for example, their Microsoft SharePoint environment (in case your IT guys don’t believe in open source or don’t want to host third-party software on their server).

Please leave your comments, if you think that I have missed an important communication rule.

Good luck!

M&E Squared: Evaluating M&E Technologies

by Roger Nathanial Ashby, Co-Founder & Principal Consultant, OpenWise

The universe of MERL Tech solutions has grown exponentially. In 2008, monitoring and evaluation tech within global development was mostly confined to mobile data collection tools like Open Data Kit (ODK) and Excel spreadsheets used to analyze and visualize survey data. In the intervening decade, a myriad of tools, companies and NGOs have been created to advance the efficiency and effectiveness of monitoring, evaluation, research and learning (MERL) through the use of technology. Whether it’s M&E platforms or suites, satellite imagery, remote sensors, or chatbots, new innovations are being deployed every day in the field.

However, how do we evaluate the impact when MERL Tech is the intervention itself? That was the question and task put to participants of the “M&E Squared” workshop at MERL Tech 2017.

Workshop participants were separated into three groups that were each given a case study to discuss and analyze. One group was given a case about improving the learning efficiency of health workers in Liberia through the mHero Health Information System (HIS). The system was deployed as a possible remedy to some of the information communication challenges identified during the 2014 West African Ebola outbreak. A second group was given a case about the use of RapidPro to remind women to attend antenatal care (ANC) for preventive malaria medicine in Guinea. The USAID StopPalu project goal was to improve the health of infants by increasing the percent of women attending ANC visits. The final group was given a case about using remote images to assist East African pastoralists. The Satellite Assisted Pastoral Resource Management System (SAPARM) informs pastoralists of vegetation through remote sensing imagery so they can make better decisions about migrating their livestock.

After familiarizing ourselves with the particulars of the case studies, each group was tasked with presenting its findings to all participants after pondering a series of questions. Some of the issues under discussion included:

  1. “How would you assess your MERL Tech’s relevance?”
  2. “How would you evaluate the effectiveness of your MERL Tech?”
  3. “How would you measure efficiency?”
  4. “How will you assess sustainability?”

Each group came up with some innovative answers to the questions posed and our facilitators and session leads (Alexandra Robinson & Sutyajeet Soneja from USAID and Molly Chen from RTI) will soon synthesize the workshop findings and notes into a concise written brief for the MERL Tech community.

Before the workshop closed, we were all introduced to the great work done by SIMLab (Social Impact Lab) in this area through their SIMLab Monitoring and Evaluation Framework. The framework identifies key criteria for evaluating M&E technology, including:

  1. Relevance – The extent to which the technology choice is appropriately suited to the priorities and capacities of the context of the target group or organization.
  2. Effectiveness – A measure of the extent to which an information and communication channel, technology tool, technology platform, or a combination of these attains its objectives.
  3. Efficiency – Measure of the outputs (qualitative and quantitative) in relation to the inputs.
  4. Impact – The positive and negative changes produced by the introduction of a technology, or a change in a technology tool or platform, on the overall development intervention (directly or indirectly; intended or unintended).
  5. Sustainability – Measure of whether the benefits of a technology tool or platform are likely to continue after donor funding has been withdrawn.
  6. Coherence – The extent to which the technology relates to the broader policy context (development, market, communication networks, data standards & interoperability mandates, and national & international law) within which it was developed and implemented.

While it’s unfortunate that SIMLab stopped most operations in early September 2017, their exceptional work in this and other areas lives on and you can access the full framework here.

I learned a great deal in this session from the facilitators and my colleagues attending the workshop. I would encourage everyone in the MERL Tech community to take the ideas generated during this workshop and the great work done by SIMLab into their development practice. We certainly intend to integrate many of these insights into our work at OpenWise. Read more about “The Evidence Agenda” here on SIMLab’s blog.

Making (some) sense of data storage and presentation in Excel

By Anna Vasylytsya. Anna is in the process of completing her Master’s in Public Policy with an emphasis on research methods. She is excited about the role that data can play in improving people’s lives!

At the MERL Tech Conference, I attended a session called “The 20 skills that solve 80% of M&E problems” presented by Dr. Leslie Sage of DevResults. I was struck by the practical recommendations Leslie shared that can benefit anyone that uses Excel to store and/or present data.

I boiled down the 20 skills presented in the session into three key takeaways, below.

1. Discerning between data storage and data presentation

Data storage and data presentation serve two different functions, and never the twain shall meet. In other words, data storage is never data presentation.

Proper data storage should not contain merged cells, subheadings, color used to denote information, different data types within a cell (numbers and letters), or more than one piece of data in a cell (such as disaggregations). Additionally, in proper data storage, the columns should be the variables and the rows the observations (or vice versa). Poor data storage practices need to be avoided because they mean that you cannot use Excel’s features to present the data.

A common example of poor data storage:

[Screenshot: an example of poor data storage in Excel]

One of the reasons that this is not good data storage is that you are not able to manipulate this data using Excel’s features. If you needed this data in a different format or you wanted to visualize it, you would have to do so manually, which would be time consuming.

Here is the same data presented in a “good” storage format:

[Screenshot: the same data in a good storage format]

Data stored this way may not look as pretty, but it is not meant to be presented or read within the sheet. This is an example of good data storage because each unique observation gets a new row in the spreadsheet. When you properly store data, it is easy for Excel to aggregate the data and summarize it in a pivot table, for example.
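For readers who work in Python as well as Excel, the same restructuring can be scripted. Here is a minimal sketch using pandas; the districts, quarters, and counts are invented purely for illustration:

```python
import pandas as pd

# Presentation-style ("wide") table: one row per district, one column per quarter.
wide = pd.DataFrame({
    "district": ["North", "South"],
    "Q1": [120, 95],
    "Q2": [140, 110],
})

# Storage-style ("long") table: one row per unique observation.
long = wide.melt(id_vars="district", var_name="quarter", value_name="participants")
print(long)
#   district quarter  participants
# 0    North      Q1           120
# 1    South      Q1            95
# 2    North      Q2           140
# 3    South      Q2           110
```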

2. Use Excel’s features to organize and clean data

You do not have to spend precious time organizing or cleaning data manually. Here are a few recommendations on Excel’s data organization and cleaning features:

  • To join two cells that contain text into one cell, use the CONCATENATE function.
  • To split text from one cell into different cells, use the Text to Columns feature.
  • To clean text data, use Excel’s TRIM, LOWER, UPPER, PROPER, RIGHT, LEFT, and LEN functions.
  • To move data from rows into columns or columns into rows, use Excel’s transpose feature.
  • To remove duplicates from the data, use the Remove Duplicates feature.
  • Create a macro to automate simple repetitive steps in Excel.
  • Insert data validation in an Excel spreadsheet if you are sending a data spreadsheet to implementers or partners to fill out.
    • This restricts the type of data or values that can be entered in certain parts of the spreadsheet.
    • It also saves you time from having to clean the data after you receive it.
  • Use the VLOOKUP function in your offline version to look up a Unique ID (a Python equivalent is sketched just after this list).
    • Funders or donors normally require that data is anonymized if it is made public. While not the best option for anonymizing data, you can use Excel if you haven’t been provided with specific tools or processes.
    • You can create an “online” anonymized version that contains a Unique ID and an “offline” version (not public) containing the ID and Personally Identifiable Information (PII). Then, if you need to answer a question about a Unique ID (for example, your survey was missing data and you need to go back and collect it), you can use VLOOKUP to find the particular record.
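As referenced in the VLOOKUP bullet above, here is a minimal pandas equivalent of that lookup; the file names and columns are hypothetical:

```python
import pandas as pd

# "Online" anonymized dataset: Unique ID plus survey responses, no PII.
public = pd.read_csv("survey_anonymized.csv")  # columns: unique_id, q1, q2, ...
# "Offline" key file kept privately: Unique ID plus personally identifiable information.
key = pd.read_csv("id_key_offline.csv")        # columns: unique_id, name, phone

# Find the records with missing data, then look up contact details (a VLOOKUP-style merge).
missing = public[public["q2"].isna()]
follow_up = missing.merge(key, on="unique_id", how="left")
print(follow_up[["unique_id", "name", "phone"]])
```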

3. Use Excel’s features to visualize data

One of the reasons to organize data properly is so that you can use Excel’s Pivot Table feature.

Here is an example of a pivot table made from the data in the good data storage example above (which took about a minute to make):

[Screenshot: a pivot table built from the good data storage example]

Using the pivot table, you can then use Excel’s Create a Chart Feature to quickly make a bar graph:

[Screenshot: a bar graph created from the pivot table]
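If you ever need to script this step, the pivot-and-chart workflow has a direct pandas/matplotlib equivalent. Here is a minimal sketch reusing the invented district/quarter data from the storage example above:

```python
import pandas as pd
import matplotlib.pyplot as plt

long = pd.DataFrame({
    "district": ["North", "South", "North", "South"],
    "quarter": ["Q1", "Q1", "Q2", "Q2"],
    "participants": [120, 95, 140, 110],
})

# Equivalent of an Excel pivot table: districts as rows, quarters as columns.
pivot = long.pivot_table(index="district", columns="quarter",
                         values="participants", aggfunc="sum")
print(pivot)

# Equivalent of "Create a Chart": a quick bar graph of the pivoted data.
pivot.plot(kind="bar")
plt.ylabel("Participants")
plt.tight_layout()
plt.savefig("participants_by_district.png")
```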

In the Future

I have fallen prey to poor data storage practices in the past. Now that I have learned these best practices and features of Excel, I know I will improve my data storage and presentation practices. And now that I have shared them with you, I hope that you will too!

Please note that in this post I did not discuss how Excel’s functions or features work or how to use them. There are plenty of resources online to help you discover and explore them. Some helpful links have been included as a start. Additionally, the data presented here are fictional and created purely for demonstration purposes.