Overview of NASA AIST Evaluations

Abstract/Agenda: 

This session will give a broad overview of the first round of technology assessments coordinated by ESIP for NASA AIST, stepping through the evaluation process as follows:

  • AIST Selects Appropriate Candidates
  • ESIP Selects Evaluators from the Earth Science and Informatics Community
  • Evaluators Collaborate with AIST PIs on Test Plan
  • Evaluators Carry Out Test Plan
  • ESIP Coordinates Communication over Slack, OSF, Email, Telecons
  • Evaluators Submit Final Report Content
  • ESIP Edits Final Report

We will also discuss evaluator feedback, lessons learned, and next steps. 

Notes: 
  1. Independent Technology Assessment within the ESIP Testbed

  • AIST (NASA's Advanced Information Systems Technology program)

  • A Technology Readiness Level (TRL) is used to assess project maturity (internal use only); NASA's standard 1–9 scale is sketched below
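    For reference, a minimal sketch of NASA's standard 1–9 TRL scale (paraphrased from NASA's published definitions, not from the session itself), written as a small Python lookup:

      # NASA's standard TRL scale, paraphrased for reference.
      TRL_SCALE = {
          1: "Basic principles observed and reported",
          2: "Technology concept and/or application formulated",
          3: "Analytical/experimental proof of concept",
          4: "Component validation in a laboratory environment",
          5: "Component validation in a relevant environment",
          6: "System/subsystem prototype demonstrated in a relevant environment",
          7: "System prototype demonstrated in an operational environment",
          8: "Actual system completed and qualified through test and demonstration",
          9: "Actual system proven through successful mission operations",
      }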

  • ESIP/AIST Collaboration:

    • ESIP provides an independent assessment of each AIST project's TRL

    • ESIP identifies opportunities and roadblocks for projects

    • Evaluation Goals

      • Achieve consistency, traceability, and defensibility of evaluation results

      • Be recognized as comprehensive and fair

      • Provide a valuable experience for PIs and project evaluators

  • Evaluation Components

    • Milestone Completions Review

    • TRL Objective Completion Review

    • Reporting

      • Evaluator Information

      • Evaluation Components

      • Test Plan

      • Infusion Potential

      • Other

      • Submit

  • Timeline: this is also the presenter's "workflow" slide, but the timeline itself is important

  • Workflow

    • AIST Selects Projects

    • ESIP Selects Evaluators

      • ESIP solicits suggestions from the PI

      • ESIP reaches out to community

      • Telecons

    • Evaluators and AIST PIs create the Test Plan

      • Considerations include access restrictions and software readiness

    • Evaluators carry out the test plan - did this seem to work well?

      • Standardizing this step may be difficult

      • Strong human component

      • Checklist for software development best practices

        • Supportability, Portability, Testability, Accessibility, Community, Governance, Licensing, Copyright, Installability, Buildability, Learnability, Documentation, Understandability, Friendliness

        • In addition: code structure, modules, code quality, and end-user evaluation

      • Would love to hear from John Graybeal about his process for developing the assessment; suggestions/ideas for the next round? With the Disaster Cluster project, for example?

        • Key to developing this was the Software Sustainability Institute's initial evaluation criteria

        • Future work could include defining further details and bringing this up to date - or perhaps providing a simpler version?

        • A nice thing about the way it's set up now is that the spreadsheet is interactive: if you are evaluating a TRL 6 project, more questions are exposed than if you are evaluating a TRL 3 project (see the sketch below).
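        As a minimal sketch of that interactivity - assuming criteria gated by a minimum TRL; the criterion names come from the checklist above, but the thresholds and question wording are illustrative assumptions, not the spreadsheet's actual contents:

          # Hypothetical model of a TRL-gated checklist: each criterion carries
          # an assumed minimum TRL, and evaluating at a given TRL exposes only
          # the questions whose threshold is at or below that level.
          CHECKLIST = {
              "Documentation":  (1, "Is there a README or user guide?"),
              "Buildability":   (3, "Can the software be built from source?"),
              "Installability": (4, "Is there a documented install procedure?"),
              "Testability":    (5, "Is there an automated test suite?"),
              "Supportability": (6, "Is there a channel for user support?"),
              "Governance":     (6, "Is a project governance model published?"),
          }

          def exposed_questions(trl):
              """Return the questions an evaluator sees at the given TRL."""
              return [f"{name}: {question}"
                      for name, (min_trl, question) in CHECKLIST.items()
                      if trl >= min_trl]

          # Evaluating a TRL 6 project exposes more questions than a TRL 3 one.
          assert len(exposed_questions(6)) > len(exposed_questions(3))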

      • Possible training track for ESIP members who want to grow their capabilities in designing and evaluating good software

        • Science Software Cluster collaboration, perhaps?

          • Perhaps they evaluate the TEF

      • Peter Fox & Chris Lynnes have done "Infusion Readiness" evaluations - are these artifacts relevant for further TEF efforts?

    • Evaluators fill out the evaluation structure

    • Evaluators submit final report content

    • ESIP edits and submits reports to AIST

  • Feedback from evaluators

  • Lessons learned

  • Outlook: Provide the Earth sciences community with a novel, needed evaluation framework to improve technology development.

 

  • More on the Evaluation Spreadsheet

    • The official TRL checklist differs from the criteria in John's spreadsheet

    • Anne Wilson has some good ideas from a science software development perspective, to include in an evaluation.

    • Lots of interest in seeing John's spreadsheet - perhaps this is a good example to bring to the "Art of Critique" concept?

    • The spreadsheet also becomes a potential 'vetted checklist' that project managers and software leads can use to check against 'industry' software standards

      • TEF for the TEF?

Attachments/Presentations: 
  EVALUATION_SESSION.pptx (14.26 MB)

Citation: 
Burgess, A.; Overview of NASA AIST Evaluations; Winter Meeting 2016. ESIP Commons, October 2015.