Sources of Project Information

Preference
- Feelings
- Values
- Aesthetics
- Biases

Reference
- Artifacts
  - Builders' references
    - Problem description
    - Architecture diagrams
    - Flowcharts
    - Project charter
    - Specifications
    - Data dictionary
    - Source code
    - Project demos
    - Timelines
    - Glossaries

  All of these references are heuristic in nature; that is, they are fallible ways to solve certain kinds of problems. They can be useful sources of project information, for sure. Yet they are often imperfect in many ways: incomplete, ambiguous, self-contradictory, vague, incoherent, out of date... You get the picture.

  Artifacts can be valuable, but they can also be unhelpful and misleading; and as with bugs, you can't always know right away that you're being misled.

  The good news is that, while this is a problem for testing, it's precisely the kind of problem that testing can greatly help to resolve. Disagreement between explicit requirements is a hint that confusion exists somewhere else. Mysterious silences or omissions in descriptions of the product might indicate that designers aren't thinking about that part of the design. If the testers don't have a straight story, it's possible that others (like developers) don't either.

  There may be value in having the testers create or revise specific artifacts for the whole team, if those artifacts are missing or messy. Testing and documenting a product or an API can be a powerful way to structure learning about it and to discover problems at the same time. But in Rapid Testing, we urge people not to create a formal, heavyweight document unless the value of doing so justifies the cost.
- Testing references
  - General
    - Risk list
    - Testing heuristics
  - Project-specific
    - Product and project risk list
    - Coverage outline
    - Test reports
    - Bug reports
- Comparable products
  - Previous version of the product
  - Competitive products
  - Known good example output
  - Known bad example output

  (Remember, a single feature, function, or data point can be comparable; that is, it can provide a basis for comparison on some point.)

- Tools
  - APIs
  - Test frameworks
  - Output checking tools (includes presumed good data and procedures for comparison)
  - Data generation
  - Visualization
    - Charting processes
    - Showing patterns or trends in data (e.g. sparklines)
    - Illustration and sketching
    - Features in existing tools (e.g. Excel: conditional formatting, sparklines)
  - Mathematical/statistical analysis
- Documents
  - Company website
    - General
    - Product-specific
  - Laws and regulations
    - External regulations
    - Standards
  - User guide
  - API documentation
  - Subject literature from the product's domain
- Repositories
  - Internal
    - Support database
    - Bug-tracking database
    - Project management tools
  - External
    - Google
    - Wikipedia
- Libraries
- Bookstores

  My personal library often grows when I do testing in a new domain.
- Standards bodies

Conference
- Who
  - Primary client
    - Product Owner
    - Development manager
    - Test manager
    - Test lead

  In Rapid Testing, your primary client is the person to whom you report directly; the person who has authority over the quality and structures of your work; the person you would probably say was your boss. There are lots of variations in the world.

  - Key stakeholders
    - Requesters
    - Managers
    - Developers
    - Maintenance people
    - Project managers
    - Designer
    - Technical support
  - Stakeholders
    - Users (focus groups)
    - Subject matter experts
    - In-house counsel
    - Users
    - Suppliers
  - Others
    - Outside experts
    - The receptionist
    - Previous testers
    - Friends
- What or how
  - Direct conversation
    - Working interaction
    - One-on-one meetings
    - Group meetings
    - Interviews
    - Conferences
    - Social interaction
  - Mediated
    - Instant messaging
    - Project wiki
    - Telephone
    - Email
    - Mail

Inference
- Imagination
- Project management expectations
- Intended usage
- Consistency principles
  - Patterns of familiar problems
  - Explainability
  - World
  - History
  - Image
  - Claims
  - Comparable products
  - User desires
  - Product
  - Purpose
  - Standards and statutes

  The product should be consistent with these principles, and inconsistent with patterns of familiar problems.

Experience
- Experiments on the product
  - Direct user interaction with the product
  - Preparation for automated checking
  - Interpretation of automated checking results
  - Interaction with the product's APIs
  - In similar and different environments
  - Output from these experiments
  - Tools used for development
- Feelings experienced during testing
  - Impatience
  - Confusion
  - Frustration
  - Annoyance
  - Surprise
  - Amusement
  - Fear
  - Curiosity
- Stories, memories, and tacit knowledge from...
  - Previous iterations (this project; business goals)
  - Previous projects (this company; business goals; business process and culture)
  - Formal education
  - Work-specific training
  - Other companies
  - Using other products
  - Life outside the development world

Notes on this map

This mind map was prepared by Michael Bolton. It includes suggestions from the "Every Tester Has a PRICE" session at EuroSTAR 2014. This is not intended to be a comprehensive listing, but the outcome of one brainstorming session with extra ideas added later.

PRICE comes from the initial letters of the top-level categories: Preference, Reference, Inference, Conference, and Experience. The mnemonic is intended to provide a trigger to think of new sources of information when testers are stuck.

Information sources are closely related to oracles: means by which we recognize problems when they happen during testing. Problems in the product and the project can be identified by observing undesirable consistencies and inconsistencies between related things.

These categories tend to overlap; that's okay. The purpose of this exercise is not to come up with an idealized map, but to learn and discover things as we sort and re-sort ideas into categories. In mind mapping, just as in testing, the mind is more important than the map.

Project information comes in many forms. In Rapid Testing, we identify Experience, Inference, Conference, and Reference as basic families of information sources. For EuroSTAR 2014, I playfully added "Preference" so that I could say that "every tester has a PRICE", focused on the latter four letters, at least. Nonetheless, "Preference" is something that drives Experience and Conference. It's at the heart of testing work to find out what people prefer; what's good enough or not; what they like and don't like; what they want and don't want. After all, testing is all about exploring the product and the systems around it to help the business decide whether the product they've got is the product they want. Preferences are central to our individual and shared feelings and mental models of the product.
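To make the "known good example output" and "output checking tools" ideas above concrete, here is a minimal sketch, not part of the original map, of checking a product's output against a presumed-good reference. The function names and the normalization step are hypothetical illustrations; real output checking usually needs domain-specific normalization (timestamps, IDs, ordering) before comparison.

```python
# Sketch: using a known good example output as an oracle.
# An inconsistency with the reference suggests a problem to investigate --
# in the product, in the reference itself, or in the comparison procedure.
# All names here are hypothetical.

def normalize(text: str) -> str:
    """Strip volatile details (here, only surrounding and trailing
    whitespace) before comparing. Real output often needs more, e.g.
    masking timestamps or run-specific IDs."""
    return "\n".join(line.rstrip() for line in text.strip().splitlines())

def check_against_reference(actual: str, reference: str) -> list[str]:
    """Return line-level differences between actual output and the
    known good example output. An empty list means no inconsistency
    was observed (which is not proof that no problem exists)."""
    problems: list[str] = []
    actual_lines = normalize(actual).splitlines()
    reference_lines = normalize(reference).splitlines()
    if len(actual_lines) != len(reference_lines):
        problems.append(
            f"line count differs: {len(actual_lines)} vs {len(reference_lines)}"
        )
    for i, (a, r) in enumerate(zip(actual_lines, reference_lines), start=1):
        if a != r:
            problems.append(f"line {i}: got {a!r}, expected {r!r}")
    return problems

# A mismatch is a prompt for investigation, not automatically a bug:
report = check_against_reference("total: 41\nstatus: ok",
                                 "total: 42\nstatus: ok")
```

The design choice worth noting: the checker reports differences rather than passing judgment, in keeping with the idea that a known good example is a fallible heuristic, since the reference output may itself be out of date.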