Application Envisioning idea
Examples from three knowledge work domains:
(Illustrated above) A financial trader returns to work to find that a new version of his group’s trading application has been installed. To test the installation, he makes some random trades between fake organizations, knowing that he can easily cancel these test deals once he is confident that the updated tool is working properly with the firm’s many interconnected systems.

A scientist needs to make sure that a new data collection application provides the same experimental results as her lab’s previous tool. To ensure that the results are comparable, she calibrates the new product and uses it to run test procedures on a set of clinical samples that have already been run using the previous tool.

An architect completes a brief tutorial in her building modeling application, during which the tool runs checks to measure the performance of her current computing infrastructure.

Knowledge workers’ early experiences of new technologies often involve critical-minded testing. As part of adopting a new computing product, individuals and their organizations may need to see that a tool is functioning consistently and in line with its marketing claims. The more important the role of an application in work activities (A), the more emphasis may be placed on ensuring that it is operating as desired before putting it into use (B5, C7, C10, G4). Even after a new tool has been successfully tested within a workplace, individual workers may go so far as to run their own verification processes to ensure that their own high standards are being met (E5) and to gain a better understanding of how a product works (K2, K6).

What these verification processes could entail depends heavily on the roles that product teams are envisioning for their application concepts. To support potential testing scenarios in targeted work practices, teams can sketch guiding functionality concepts that could provide users with clear, instrumental outputs (F6). Definers and designers may even mandate and automatically initiate some standard verifications within their products, such as testing certain features in the context of an organization’s IT infrastructure (K10).
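As a loose illustration of the mandated, automatically initiated verifications described above, the sketch below shows one way an application might run a fixed set of startup checks and present decisive pass/fail outputs. All names and checks here are hypothetical assumptions for illustration, not drawn from any particular product.

```python
# Hypothetical sketch: an application self-check that runs a fixed set of
# environment verifications at startup and reports clear pass/fail results.
# Every name and check below is illustrative, not a real product API.

from dataclasses import dataclass
from typing import Callable, List


@dataclass
class Check:
    name: str                 # label shown to the user in the results display
    run: Callable[[], bool]   # returns True when the verification passes


def run_startup_checks(checks: List[Check]) -> List[str]:
    """Run each verification and return human-readable result lines."""
    results = []
    for check in checks:
        try:
            ok = check.run()
        except Exception:
            ok = False  # a crashing check is treated as a failure, not an error
        results.append(f"{'PASS' if ok else 'FAIL'}: {check.name}")
    return results


# Two illustrative checks a product team might mandate; the lambdas stand in
# for real probes of an organization's IT infrastructure.
checks = [
    Check("Local storage is writable", lambda: True),
    Check("Required network service reachable", lambda: False),
]

for line in run_startup_checks(checks):
    print(line)
```

The design choice worth noting is the unambiguous, instrumental output: each verification yields a single labeled PASS or FAIL line, so workers do not have to interpret raw diagnostics to judge whether the tool is operating as desired.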
When product teams do not actively consider how individual workers and organizations will need or want to verify an application’s successful operation, the resulting computing tools may be difficult to adopt with the level of confidence that is often needed to support real-world use (D2, D3). If workers’ unguided, ad hoc verification processes produce mixed results, negative halo effects may lead people to abandon key functional areas, an entire product, or even a larger brand.
See also: C1, C9, F11, G1, G3, H, I, J, K, M
Application Envisioning questions:
More specific questions for product teams to consider:
Who is responsible for “officially” testing the functionality of new or updated applications within the organizations that your team is targeting?
What larger technology trends and advanced analogies to other domains could valuably inform your team’s envisioning of product verification experiences?
Given the scope of knowledge work activities that your team is targeting, what types of operational checks might customer organizations require of your computing tool?
After an organization has tested your application, what verifications might targeted knowledge workers want to repeat themselves in order to confidently incorporate your new or updated product into their own work practices?
What operational verifications could be important to individual workers but may not seem as important to their organizations?
What tests might be beneficially rerun as your application is used through advancing versions of an organization’s IT infrastructure over time?
How might your team envision the interactive procedures of infrequently accessed testing functionalities in a way that could allow users to easily and accurately run them without additional instruction?
What role could automation play in the flow of testing actions?
What streamlined instrumental displays might your team sketch with the goal of decisively presenting certain verification outcomes?
What potential errors or problems could arise in certain testing processes? How might they be prevented or handled in your sketched functionality concepts?
How might your ideas about application verification tie into your sketched directions for instructional assistance?
Do you have enough information to usefully answer these and other envisioning questions? What additional research, problem space models, and design concepting could valuably inform your team’s application envisioning efforts?