Forms
Contents
- 1 Overview
- 2 Form - Essentials
- 2.1 Determining the right data for you
- 2.2 Determining the right data structure for you
- 2.3 Determining the structure and layout of questions
- 2.4 Determining how many questions to ask, when to ask them, who has access to that data, and for how long
- 2.5 Determining the quality of question captions and instructions
- 2.6 Determining actionable insights for improving your form
- 2.7 Determining what to change and managing the change
Overview
Grant and research application forms are often the first step in the grant making process. These forms are typically used to collect information about an applicant (how the applicant intends to use the funds received, the applicant's goals, expected outcomes, and timelines). The answers to the form questions are used in the process to decide if the applicant will receive funding.
On the surface, the application form is a set of questions, but like an iceberg there is more below the surface than meets the eye. So how do you ensure your application forms are collecting the right data and providing the best user experience to your community?
This article presents a framework of 7 things to consider when designing and building application forms in SmartSimple. These 7 considerations will help you create forms that will collect the right data and provide the best user experience for your community.
Form - Essentials
Determining the right data for you
Maybe you have an existing form or you know exactly the questions you want to ask. Before you start building your form in SmartSimple, ask yourself: who, what, why, and how.
Who are we collecting this data for?
Identify who needs to use this data. Some data may be used by internal teams. If so, which teams and who specifically? Some data may be used for reporting to government agencies or 3rd parties or to show the outcomes of funding. If so, delineate who needs the data.
What data do I need to collect?
Even if you are using a template, a common form, or an existing form, ask yourself: what is the data I really need to collect and do I have a question that will return that data? If you don’t need a question, remove it. Every question you add to a form makes the form harder and longer to fill out. All the information you don’t need distracts from what is important.
Why are we collecting this data?
Identify why you are asking each question. Some answers may be essential for making your funding decision, others might be used for reporting outcomes. If you don’t need to know an applicant's mission statement to make a decision, maybe you don’t need to ask that question.
How are we going to use the collected data?
Identify how the data will be used in your processes. If you are using this information for reporting, will the answers be anonymized and rolled up into an external facing public search or maybe sent to a government agency or 3rd party? If you are using the information for decision making, how will the answers be made available inside the system and restricted to specific people? How will you mitigate bias based on demographic or personal information submitted? Are you going to allow people to self-identify and is that information optional?
Determining the right data structure for you
Now that you have determined the questions you want to ask to get the data you need, consider what the right data structure is for you within the context of SmartSimple.
What custom fields are right for you?
The following custom fields may be well suited to these answer styles:
| Answer Style | Custom Field |
|---|---|
| Text based answers including dates and numbers | Text Box (Text Single Line, Text Multiple Lines, Date, Date and Time, Email, Number, Phone Number) |
| Selecting a predefined answer option | Select One/Many (Radio Buttons, Checkboxes, Dropdown List with Predefined Options or Dynamic Content) |
| Selecting a predefined answer option when there are many options | Lookup (Autocomplete) |
| Reference documents, files, media | Upload (Multiple File storage with or without Media Library enabled), Image |
| Tabular information | Upload (Multiple File storage with or without using a parser), Advanced/Basic Data Table |
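As a rough illustration of how these answer styles might translate into question definitions, the sketch below models a question as a small data structure. The field type names simply mirror the table above; they are illustrative labels and not SmartSimple's internal configuration schema.

```typescript
// Hypothetical sketch of a question definition keyed to the answer styles above.
// The field type names mirror the table; they are illustrative only.
type CustomFieldType =
  | "Text Single Line"
  | "Text Multiple Lines"
  | "Date"
  | "Number"
  | "Select One/Many"
  | "Lookup (Autocomplete)"
  | "Upload (Multiple File Storage)"
  | "Data Table";

interface QuestionDefinition {
  caption: string;              // the question text shown to the applicant
  fieldType: CustomFieldType;   // which custom field will capture the answer
  options?: string[];           // predefined options for select-style fields
  mandatory: boolean;           // whether an answer is required to submit
}

// Example: a select-style question with predefined options.
const fundingArea: QuestionDefinition = {
  caption: "Which program area does your project support?",
  fieldType: "Select One/Many",
  options: ["Education", "Health", "Environment"],
  mandatory: true,
};

console.log(fundingArea.caption);
```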
How will you get the data to the people who need it?
Different people may need to access data in different ways. For example, you may need to aggregate the answers or surface the answers in a report export, PDF, JSON file, web-based portal, public search, map plot, or something completely custom using the API. How you intend to output the data influences how the data needs to be stored (and where) as well as which features and fields will work best for you within SmartSimple.
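For example, if the answers feed an external-facing public search, only rolled-up, non-identifying values should leave the system. The sketch below shows one way that aggregation might look; the record shape and field names are assumptions for illustration, not a SmartSimple export format.

```typescript
// Illustrative sketch only: shaping collected answers into an anonymized,
// aggregated JSON export for an external-facing search or a third party.
interface ApplicationRecord {
  applicantName: string;   // PII - excluded from the public export
  programArea: string;
  amountRequested: number;
}

interface PublicSummary {
  programArea: string;
  applications: number;
  totalRequested: number;
}

function summarizeForPublicSearch(records: ApplicationRecord[]): PublicSummary[] {
  const byArea = new Map<string, PublicSummary>();
  for (const record of records) {
    const row = byArea.get(record.programArea) ?? {
      programArea: record.programArea,
      applications: 0,
      totalRequested: 0,
    };
    row.applications += 1;
    row.totalRequested += record.amountRequested;
    byArea.set(record.programArea, row);
  }
  // Only rolled-up, non-identifying values are included in the output.
  return Array.from(byArea.values());
}

console.log(JSON.stringify(summarizeForPublicSearch([
  { applicantName: "Jane Doe", programArea: "Health", amountRequested: 5000 },
  { applicantName: "A. Lee", programArea: "Health", amountRequested: 2500 },
]), null, 2));
```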
Where will the data be stored within your system?
Consider where the data you collect is stored. For example, information about a grantee might best be stored on that grantee's profile or their organization’s profile. This way the user is not required to enter the same information into multiple forms. If information needs to be displayed in multiple locations or multiple systems, consider where the single source of truth will be and what features will be used to update other locations such as system variables, data exchange, or the SmartConnect API. If data is coming from an outside source, consider which integration you will use. For example, you may choose to populate an organization's details based on the information from the IRS verification service instead of having the user enter their own organization details.
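As a loose illustration of the single-source-of-truth idea, the sketch below prefills an organization profile from a verification lookup so the applicant never re-types those details. The verifyOrganization function and its response shape are placeholders invented for this example; they do not represent the actual IRS verification service or the SmartConnect API.

```typescript
// Hypothetical sketch: treating a verification lookup as the source of truth
// for organization details so applicants do not re-enter them.
interface OrganizationProfile {
  legalName: string;
  ein: string;
  address: string;
}

async function verifyOrganization(ein: string): Promise<OrganizationProfile> {
  // Placeholder standing in for a call to an external verification service.
  return { legalName: "Example Foundation", ein, address: "123 Main St" };
}

async function prefillOrganization(ein: string): Promise<OrganizationProfile> {
  // Verified details populate the organization record, keeping one source of
  // truth instead of asking the user to type the same information again.
  const verified = await verifyOrganization(ein);
  return { ...verified };
}

prefillOrganization("12-3456789").then((org) => console.log(org));
```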
How will you coordinate data structures across programs?
You may have many programs running for many years across many teams. Think about how you can ensure the use of the same fields and structures across multiple programs and teams. Also consider if your application cycles require the use of versioning so form changes are only applied to specific cohorts of applicants.
How will the process flow?
You may have many people contributing to the form, including the applicant and their team as well as internal and external teams taking part in the reviewing, advising, or approving process. Will you be using features like invitations, annotations, and group email? Consider the full application lifecycle, including how outcomes will be reported and how you will demonstrate the outcome of the funding, such as who benefited. Also consider what types of communication you will have with the applicant. Will you use notes, annotations, emails, or a separate UTA to manage these communications? Can applicants upload an audio/video application or explanation using the media library, or will all communication be text based? How can your users get help along the way if things don't go as planned or something unexpected happens? Will you use the +AI integration as part of the pre-screening process and allow other roles to get help from the artificial intelligence as part of their processes? Will you use the e-signature integration for digitally signing documents, or some other integration, and if so, which vendor?
Determining the structure and layout of questions
Now that you have determined the structure of the data so you can output the data your community needs in the ways they need it, consider the surface of the form.
How will you logically group content?
What groupings will make sense to your community's mental model? For example, will you put all the questions related to applicant outcomes under the same title bar? Will you place related content on the same tab, and if so, what logic will you use for grouping like content? Grouping similar content under a title bar makes it easier for users to jump to the desired content. Will you use the title bar navigation pane to jump to parts of your application? Will you favor vertical scrolling for an improved mobile experience or go with a horizontal tabbed layout? Exercises such as card sorting can help with designing the information architecture of your form.
How will you present the content?
Consider the following content presentation options:
- Present the content in a single or multiple column layout
- Set the display order and visibility of questions
- Use the linked record list to show records in-line
- Use the Title Bar Navigation Pane for jumping to sections of content
- Add an instructions custom field for section level clarity
- Add a countdown timer for expiring calls
Determining how many questions to ask, when to ask them, who has access to that data, and for how long
Now that we have determined the organization and layout of the questions, consider how many questions to add, whether everyone should see every question, and when the questions should be answered.
How many questions are too many?
The more questions you ask, the less likely the form is to be completed. The right number of questions depends on your community and your needs. Be mindful that a lot of questions create a barrier to entry, especially for the most vulnerable applicants who may not be able to hire a grant writer. The number of questions can also depend on the amount of funding and your relationship with the applicant. You might be able to use a much shorter trust-based application for applicants requesting a small amount of money or for applicants who have received monies in the past and met their reporting requirements. Also consider whether each question needs to be mandatory and whether you already have the answer because you asked that question last year or as part of the signup process. If a question could be optional, do you still need to ask it?
When should a question be shown?
Consider that you don't need to ask every question all at once. You could start with an eligibility questionnaire to see if the user meets the criteria for the program. Then ask some questions at the start of the process and, through progressive disclosure, ask more questions as the applicant moves through your process. You may want to apply conditional logic around some questions or use Dynamic Field Visibility Controls, since not all questions are needed in all circumstances.
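The sketch below illustrates the idea of progressive disclosure in general terms: visibility rules are evaluated against earlier answers, and only the relevant questions are surfaced. The field names and thresholds are invented for this example; in SmartSimple you would typically configure this with conditional logic or Dynamic Field Visibility Controls rather than custom code.

```typescript
// A minimal sketch of progressive disclosure, assuming visibility rules are
// evaluated against earlier answers. Field names and thresholds are invented.
type Answers = Record<string, string | number | undefined>;

interface VisibilityRule {
  questionId: string;
  isVisible: (answers: Answers) => boolean;
}

const rules: VisibilityRule[] = [
  // Only show the detailed budget when a larger amount is requested.
  { questionId: "detailedBudget", isVisible: (a) => Number(a.amountRequested ?? 0) > 25000 },
  // Only ask for fiscal sponsor details when the applicant says they have one.
  { questionId: "fiscalSponsorName", isVisible: (a) => a.hasFiscalSponsor === "Yes" },
];

function visibleQuestions(answers: Answers): string[] {
  return rules.filter((rule) => rule.isVisible(answers)).map((rule) => rule.questionId);
}

console.log(visibleQuestions({ amountRequested: 50000, hasFiscalSponsor: "No" }));
// -> ["detailedBudget"]
```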
Who can access the data and for how long?
Some questions may involve personally identifiable information (PII). Make sure you are transparent about why you are collecting this data, how you are using it, and for how long you will keep the data. Consider your data retention and masking policies as well as legislation such as GDPR. Also consider which people (roles) will have access to what data. Consider concealing some data that might influence reviewers and decision makers and help reduce bias around diversity, equity, and inclusion (DEI). You may also want users to accept your own privacy and security policies before allowing them to apply to your programs.
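One common pattern is to conceal PII fields from reviewer roles before the data is displayed. The sketch below shows that idea in general terms; the field names and roles are examples only, and in SmartSimple this is normally handled through field and role permissions rather than code.

```typescript
// Illustrative sketch, assuming a simple role-based masking policy for PII
// fields before answers are shown to reviewers.
type Role = "administrator" | "reviewer";

interface ApplicantData {
  projectSummary: string;
  applicantName: string;   // PII
  dateOfBirth: string;     // PII
}

const PII_FIELDS: (keyof ApplicantData)[] = ["applicantName", "dateOfBirth"];

function maskForRole(data: ApplicantData, role: Role): ApplicantData {
  if (role === "administrator") return data;
  const masked = { ...data };
  // Conceal identifying answers from reviewers to help reduce bias.
  for (const field of PII_FIELDS) {
    masked[field] = "[concealed]";
  }
  return masked;
}

console.log(maskForRole(
  { projectSummary: "After-school tutoring", applicantName: "Jane Doe", dateOfBirth: "1990-01-01" },
  "reviewer",
));
```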
Determining the quality of question captions and instructions
Now that we are asking the right questions, consider the quality of those questions and their instructions.
How well are the questions working?
Based on the data you needed to gather, are you receiving the answers you needed? If not, should the question be rephrased or clarified? If you are not getting the data you want, is the question still valid?
How well are the instructions working?
If you are including instructions, are they concise? Do they use the clearest possible language, or do they over explain and overcomplicate? Are you using the right structure for your instructional text? Sometimes you might want to add text under the caption, other times in a tooltip, or you may want to add an instructions field with visibility permissioned by role.
Determining actionable insights for improving your form
Now that we have determined how well our questions and instructions are working, consider how to identify the challenges in your users' journeys.
What challenges are users experiencing in their journeys?
Consider how you discover the challenges your users are experiencing as part of their user journey. Are your users completing their tasks or are applications being abandoned? If people are getting stuck, where in the process is that happening? Consider the service design of your application process. There are a number of methods to conduct user research. Behaviorally, you can watch what people do by looking at metrics. Qualitatively, you can determine why and how users are doing what they do through usability testing, interviews, and focus groups. Attitudinally, you can assess what people say through surveys and check the information architecture of the forms through exercises like card sorting. Quantitatively, you can look at reports, list views, and statuses to see how many people are finding challenges and in which areas. By analyzing trends with incomplete fields and applications you may gain insight into potential roadblocks.
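On the quantitative side, even a simple count of where applications stall can point to a roadblock. The sketch below tallies records by status using data you might pull from a list view or report export; the status values and record shape are assumptions for illustration.

```typescript
// A rough sketch of the quantitative angle: counting where applications stall
// by status. The status values and record shape are assumptions.
interface ApplicationRow {
  id: number;
  status: "Draft" | "Submitted" | "Under Review" | "Approved" | "Declined";
}

function countByStatus(rows: ApplicationRow[]): Record<string, number> {
  const counts: Record<string, number> = {};
  for (const row of rows) {
    counts[row.status] = (counts[row.status] ?? 0) + 1;
  }
  return counts;
}

const rows: ApplicationRow[] = [
  { id: 1, status: "Draft" },
  { id: 2, status: "Draft" },
  { id: 3, status: "Submitted" },
  { id: 4, status: "Approved" },
];

const counts = countByStatus(rows);
// A high proportion of records stuck in Draft can signal a roadblock in the form.
console.log(counts, `Draft share: ${(((counts["Draft"] ?? 0) / rows.length) * 100).toFixed(0)}%`);
```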
Keep in mind there may be many users in your system such as applicants, program managers, reviewers, stakeholders, funders, and administrators. Each user role may have different challenges to consider. Beyond user roles, the people in your system also have differing abilities and expertise levels so be mindful to be inclusive.
Determining what to change and managing the change
Once we have established the challenges our users are experiencing, we can look at what to change and how to manage that change.
What to change?
You have a list of pain points for your users but limited time, budget, and technical constraints to consider. Try consolidating your list of challenges and rank them by what you perceive as benefit (value), effort, and risk. Changes could be as simple as changing some text or as complex as changing your internal processes or adopting a feature or integration.
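A lightweight way to run the ranking exercise is to score each candidate change on perceived benefit, effort, and risk and sort by the result. The 1-to-5 scales and weighting below are assumptions; adjust them to suit your team.

```typescript
// A minimal sketch of the ranking exercise described above: scoring each
// candidate change by perceived benefit, effort, and risk. The scales and
// weights are assumptions for illustration.
interface CandidateChange {
  name: string;
  benefit: number; // 1 (low) to 5 (high)
  effort: number;  // 1 (low) to 5 (high)
  risk: number;    // 1 (low) to 5 (high)
}

function rankChanges(changes: CandidateChange[]): CandidateChange[] {
  // Higher benefit raises the score; higher effort and risk lower it.
  const score = (c: CandidateChange) => c.benefit - 0.5 * c.effort - 0.5 * c.risk;
  return [...changes].sort((a, b) => score(b) - score(a));
}

console.log(rankChanges([
  { name: "Reword eligibility instructions", benefit: 3, effort: 1, risk: 1 },
  { name: "Adopt e-signature integration", benefit: 4, effort: 4, risk: 3 },
  { name: "Restructure application tabs", benefit: 5, effort: 3, risk: 2 },
]));
```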
How to change?
Consider how you will implement the change. Will you make the change internally or have an external party make the change? If the change will be made externally, will it be made by SmartSimple (Support, the implementation team, configuration hours, an RFS) or by a SmartSimple partner? Do you need stakeholder buy-in? How will people be notified of the upcoming change, and when will it be applied? Is there a review process for the change? Do you need to use features like Versioning, Draft Portal, Batch Update, Autoloader, or the test to production (T2P) tool? How do you measure the impact and success of your change?
Creating an application form that collects the right data with the best user experience involves more than adding questions to the form. Consider the 7 key areas above before you create the form. What is right for you will be as unique as your evolving needs.