100 Note-Taking Ideas for Data Scientists & Analysts in 2026

Streamline data review meetings, stakeholder presentations, and experiment tracking with these 100 note-taking ideas tailored for data scientists and analysts.

For data scientists, business analysts, and ML engineers, effectively capturing insights during data review meetings and stakeholder presentations is crucial. This resource provides 100 note-taking ideas designed to help you document technical discussions clearly, track experiment results, and communicate complex information efficiently, bridging the gap between technical and non-technical audiences.


Meeting & Presentation Notes

Pre-define Presentation Goal

Beginner

Before any stakeholder presentation, note down the primary objective you want to achieve or the key decision you need to facilitate.

stakeholder presentations

Audience-Specific Glossary

Beginner

Create a quick reference list of technical terms that might need simplification for non-technical stakeholders during a presentation.

stakeholder presentations

Action Items & Owners (Real-time)

Intermediate

Dedicate a section of your notes to clearly list action items, assignees, and due dates as they arise in data review meetings.

data review meetings

Key Decisions Log

Intermediate

Maintain a running log of all significant decisions made during a meeting, including the rationale and who agreed.

data review meetings

Stakeholder Questions Bank

Beginner

Keep a separate section for questions raised by stakeholders, noting if they were answered or need follow-up analysis.

stakeholder presentations

Visual Cues for Emphasis

Beginner

Use stars, underlines, or different colors to highlight critical data points or insights discussed in a meeting.

data review meetings

Parking Lot for Out-of-Scope Items

Intermediate

Designate a 'parking lot' section for ideas or questions that are important but not directly relevant to the current meeting agenda.

requirements gathering

Hypotheses & Assumptions Tracking

Intermediate

Document any hypotheses being tested or assumptions being made during data discussions, especially in requirements gathering.

requirements gathering

Dashboard Walkthrough Flow

Intermediate

When reviewing a dashboard, note the logical flow and narrative you want to convey, anticipating potential user questions.

dashboard walkthroughs

Feedback Categorization (Dashboard)

Intermediate

Organize feedback received during a dashboard walkthrough into categories like 'UI/UX', 'Data Accuracy', 'New Feature Request'.

dashboard walkthroughs

Model Performance Metrics Checklist

Advanced

Create a checklist of key performance metrics (e.g., accuracy, precision, recall) to cover during model review sessions.

model review sessions
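A minimal, dependency-free sketch of the three headline metrics from the checklist above, accuracy, precision, and recall; the function name and the toy labels are illustrative, not part of any particular library:

```python
def classification_metrics(y_true, y_pred):
    """Compute accuracy, precision, and recall for binary labels (1 = positive)."""
    tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))  # true positives
    fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))  # false positives
    fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))  # false negatives
    correct = sum(t == p for t, p in zip(y_true, y_pred))
    return {
        "accuracy": correct / len(y_true),
        "precision": tp / (tp + fp) if tp + fp else 0.0,
        "recall": tp / (tp + fn) if tp + fn else 0.0,
    }

# Toy example: 6 observations, 3 actual positives.
metrics = classification_metrics([1, 0, 1, 1, 0, 0], [1, 0, 0, 1, 0, 1])
```

Noting the exact definitions next to the numbers in your checklist avoids later debates about how "precision" was computed.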

Edge Case Documentation

Advanced

In model review, specifically note down any identified edge cases where the model performed unexpectedly or poorly.

model review sessions

Technical Debt Notes

Advanced

During any technical discussion, make a note of potential technical debt or areas for future optimization.

data review meetings

Stakeholder Questions (for self-reflection)

Intermediate

If presenting, note down specific questions asked by stakeholders to reflect on how to improve future communication.

stakeholder presentations

Data Source & Lineage Questions

Intermediate

When reviewing data, note any questions about data sources, transformations, or lineage for further investigation.

data review meetings

Impact Assessment Notes

Intermediate

For any proposed change or finding, quickly jot down the potential business impact (positive or negative).

data review meetings

Risk & Mitigation Tracking

Advanced

Identify potential risks related to data quality, model deployment, or project timelines, and note down mitigation strategies.

requirements gathering

Future Iteration Ideas

Beginner

Keep a running list of ideas for future enhancements or analyses that emerge during discussions.

requirements gathering

Key Takeaway Summaries

Beginner

At the end of a meeting or presentation, dedicate a small section to summarize the 2-3 most important takeaways.

data review meetings

Confidence Score for Findings

Intermediate

Assign a subjective 'confidence score' (e.g., low, medium, high) to key insights presented, indicating the robustness of the data.

stakeholder presentations

Experiment Tracking & Analysis

Experiment Goal & Hypothesis

Beginner

Clearly state the primary goal of the experiment and the specific hypothesis you are testing before starting.

experiment tracking

Experimental Design Parameters

Intermediate

Document all critical parameters of your experiment design: sample size, control/treatment groups, duration, metrics.

experiment tracking
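One design parameter worth computing rather than guessing is sample size. A rough stdlib sketch using the standard normal-approximation formula for comparing two proportions (the conversion rates in the comment are hypothetical):

```python
from math import ceil
from statistics import NormalDist

def sample_size_per_group(p1, p2, alpha=0.05, power=0.80):
    """Normal-approximation sample size per group for a two-sided
    test of two proportions."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided critical value
    z_beta = NormalDist().inv_cdf(power)           # power requirement
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return ceil((z_alpha + z_beta) ** 2 * variance / (p1 - p2) ** 2)

# e.g. detecting a lift from a 10% to a 12% conversion rate
n = sample_size_per_group(0.10, 0.12)
```

Recording the computed n alongside the assumed rates makes it easy to revisit the design if early data suggests the baseline rate was wrong.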

Data Sources & Preprocessing Steps

Intermediate

Keep detailed notes on where the data came from and on any cleaning, transformation, or feature engineering performed.

experiment tracking

Model Architecture & Hyperparameters

Advanced

For ML experiments, record the specific model architecture used and all tuned hyperparameters.

experiment tracking

Key Metrics & Evaluation Criteria

Intermediate

List the primary and secondary metrics used to evaluate experiment success, along with their definitions.

experiment tracking

Observed Results & Deviations

Intermediate

Regularly log the actual results, noting any unexpected deviations from expected outcomes.

experiment tracking

Statistical Significance Notes

Advanced

Document the p-values, confidence intervals, or other statistical tests performed to assess significance.

experiment tracking
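For a simple A/B test on conversion rates, the significance note can include the actual computation. A stdlib sketch of a pooled two-proportion z-test; the counts are hypothetical:

```python
from statistics import NormalDist

def two_proportion_z_test(successes_a, n_a, successes_b, n_b):
    """Two-sided pooled z-test for a difference between two conversion rates."""
    p_a, p_b = successes_a / n_a, successes_b / n_b
    p_pool = (successes_a + successes_b) / (n_a + n_b)  # pooled proportion
    se = (p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b)) ** 0.5
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))        # two-sided p-value
    return z, p_value

# Hypothetical A/B result: 120/1000 vs 150/1000 conversions.
z, p_value = two_proportion_z_test(120, 1000, 150, 1000)
```

Logging the test statistic and p-value together with the raw counts means the result can be re-verified long after the experiment ends.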

Learnings & Next Steps

Beginner

Summarize what was learned from the experiment and outline the immediate next steps or follow-up experiments.

experiment tracking

Feature Importance Insights

Advanced

If applicable, note down the most influential features identified by the model or analysis.

model review sessions

Bias & Fairness Considerations

Advanced

Document any detected biases in the data or model, and potential strategies for mitigation.

model review sessions

Experiment Version Control

Intermediate

Use version numbers or dates to clearly differentiate between different iterations of an experiment.

experiment tracking

Environmental Setup (e.g., library versions)

Advanced

Record the specific software versions, libraries, and computational environment used to ensure reproducibility.

experiment tracking
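Capturing the environment can be automated instead of typed by hand. A minimal stdlib sketch that snapshots the interpreter, OS, and installed package versions; the package names passed in are examples:

```python
import platform
import sys
from importlib.metadata import PackageNotFoundError, version

def snapshot_environment(packages):
    """Return a text snapshot of the interpreter, OS, and package versions."""
    lines = [f"python {platform.python_version()} ({sys.platform})"]
    for name in packages:
        try:
            lines.append(f"{name}=={version(name)}")
        except PackageNotFoundError:
            lines.append(f"{name} (not installed)")
    return "\n".join(lines)

# Example packages -- list whatever your experiment actually imports.
print(snapshot_environment(["pip", "numpy"]))
```

Pasting this snapshot into the experiment's notes (or committing it next to the code) is usually enough to reproduce results months later.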

Challenges & Roadblocks Encountered

Intermediate

Note down any difficulties faced during the experiment, such as data quality issues or model convergence problems.

experiment tracking

Visualization Ideas for Results

Intermediate

Jot down ideas for how best to visualize the experiment's results for different audiences.

stakeholder presentations

Comparison with Baseline/Previous Models

Advanced

Always include notes comparing the current experiment's performance against an established baseline or previous model iterations.

model review sessions

Cost-Benefit Analysis Notes

Advanced

For A/B tests or new feature deployments, calculate and note the potential business impact in monetary terms if possible.

experiment tracking

Ethical Implications

Advanced

Consider and note any ethical implications of the experiment or model deployment, especially concerning sensitive data.

model review sessions

Reproducibility Checklist

Intermediate

Create a checklist to ensure all necessary components (code, data, environment) are documented for reproducibility.

experiment tracking

Unexpected Findings

Intermediate

Document any serendipitous discoveries or unexpected patterns observed during the analysis, even if not directly related to the hypothesis.

experiment tracking

Confidence in Deployment

Advanced

After model review, note the team's collective confidence level in deploying the model to production.

model review sessions

Requirements & Project Management Notes

User Stories & Acceptance Criteria

Intermediate

For each requirement, write a user story ('As a [user], I want to [action], so that [benefit]') and its acceptance criteria.

requirements gathering

Data Dictionary Entries

Intermediate

As new data elements are discussed, start or update a data dictionary with definitions, types, and sources.

requirements gathering

Dependencies & Blockers

Intermediate

Clearly list any dependencies on other teams or projects, and any potential roadblocks that could hinder progress.

requirements gathering

Scope Definition & Exclusions

Intermediate

Explicitly define what is IN scope for the project and, equally important, what is OUT of scope.

requirements gathering

Success Metrics for Project

Intermediate

Document the overarching metrics that will define the success of the entire project, not just individual experiments.

requirements gathering

Stakeholder Roles & Responsibilities

Beginner

Note down who is responsible for what, especially in cross-functional projects, to avoid confusion.

requirements gathering

Technical Constraints & Limitations

Advanced

Document any known technical limitations, such as system performance, data storage, or integration challenges.

requirements gathering

Regulatory & Compliance Requirements

Advanced

If applicable, note down any specific regulatory, legal, or compliance requirements that the project must adhere to.

requirements gathering

Timeline & Milestones Tracking

Beginner

Keep a high-level overview of project timelines, key milestones, and target delivery dates.

requirements gathering

Open Questions Log

Beginner

Maintain a dedicated section for unresolved questions that require further investigation or clarification from stakeholders.

requirements gathering

Decision Matrix (Pros/Cons)

Intermediate

When faced with choices, create a simple pro/con list for each option to aid decision-making and document the rationale.

data review meetings

Glossary of Business Terms

Beginner

Beyond technical terms, maintain a glossary of specific business terminology relevant to the project for universal understanding.

requirements gathering

Use Case Scenarios

Intermediate

Describe different scenarios in which the data product or model will be used, identifying user interactions and expected outcomes.

requirements gathering

Error Handling & Fallback Strategies

Advanced

Document how the system should behave when errors occur or if data sources become unavailable.

requirements gathering

Security & Privacy Considerations

Advanced

Note down any specific requirements or concerns related to data security, privacy, and access control.

requirements gathering

Resource Allocation Notes

Intermediate

Keep track of which team members are assigned to specific tasks or components of the project.

data review meetings

Definition of Done (DoD)

Intermediate

Clearly articulate the criteria that must be met for a task, feature, or project increment to be considered 'done'.

requirements gathering

Post-Mortem Learnings

Intermediate

After project completion, document key learnings from successes and failures to improve future projects.

data review meetings

Budget & Cost Tracking

Advanced

For larger projects, keep notes on budget allocations, actual spend, and potential cost overruns.

requirements gathering

Communication Plan Notes

Beginner

Outline who needs to be informed, when, and through what channels for different project updates or issues.

stakeholder presentations

Dashboard & Reporting Notes

Dashboard Objective & Audience

Beginner

Before building, clearly define the primary purpose of the dashboard and its target audience.

dashboard walkthroughs

Key Performance Indicators (KPIs)

Intermediate

List all primary and secondary KPIs that the dashboard will track, along with their definitions and calculation logic.

dashboard walkthroughs

Data Sources & Refresh Frequency

Intermediate

Document where the data for each visual comes from and how often it is updated.

dashboard walkthroughs

Visual Design Choices Justification

Intermediate

Note down the reasons behind specific chart types, color palettes, or layout decisions.

dashboard walkthroughs

Filtering & Interactivity Options

Intermediate

Document all available filters, drill-down options, and other interactive elements.

dashboard walkthroughs

User Feedback & Iterations

Intermediate

Keep a log of all feedback received during walkthroughs and how it was addressed in subsequent iterations.

dashboard walkthroughs

Alerts & Anomaly Detection Logic

Advanced

If the dashboard includes alerts, document the thresholds and conditions that trigger them.

dashboard walkthroughs

Performance Optimization Notes

Advanced

Document any steps taken to improve dashboard loading times or query performance.

dashboard walkthroughs

Access Control & Permissions

Intermediate

Note down who has access to the dashboard and what level of permissions they have.

dashboard walkthroughs

Data Quality Checks Performed

Advanced

List the data quality checks implemented for the dashboard's underlying data sources.

dashboard walkthroughs

Narrative & Storytelling Flow

Intermediate

Plan out the story you want the dashboard to tell, guiding the user through key insights.

stakeholder presentations

Drill-Down Paths & Dependencies

Advanced

Map out the possible drill-down paths and which visuals or data points they are linked to.

dashboard walkthroughs

Definition of 'Normal' Performance

Intermediate

Establish and note down what constitutes 'normal' or 'expected' performance for each KPI.

dashboard walkthroughs

Assumptions in Reporting

Intermediate

Document any assumptions made during data aggregation or calculation that might impact the report's interpretation.

dashboard walkthroughs

Scheduled Distribution Details

Beginner

Note down who receives scheduled reports, the frequency, and the delivery method.

dashboard walkthroughs

Comparative Analysis Context

Intermediate

If comparing current data to historical periods or benchmarks, note the context and rationale for the comparison.

dashboard walkthroughs

Potential Misinterpretations

Advanced

Anticipate ways the dashboard data might be misinterpreted and plan how to address them in explanations.

stakeholder presentations

Future Enhancements Wishlist

Intermediate

Keep a running list of desired features or improvements for the dashboard based on evolving needs.

dashboard walkthroughs

Data Governance Implications

Advanced

Note any implications for data governance, such as data ownership, stewardship, or retention policies.

dashboard walkthroughs

User Training Material Outline

Intermediate

If the dashboard is complex, outline key points for a user training session or documentation.

dashboard walkthroughs

General Data Science Workflow Notes

IDE/Environment Configuration

Beginner

Document your specific IDE setup, extensions, and any custom environment variables for consistency.

experiment tracking

Code Snippet Library

Beginner

Maintain a personal library of reusable code snippets for common data cleaning, visualization, or modeling tasks.

experiment tracking

Troubleshooting Log

Intermediate

Whenever you encounter and solve a technical problem, document the problem, the steps taken, and the solution.

experiment tracking

Literature Review Summaries

Advanced

For new techniques or models, summarize key papers, their methodologies, and relevant findings.

model review sessions

Concept Explanations (for self)

Beginner

Write down simplified explanations of complex algorithms or statistical concepts to solidify your understanding.

model review sessions

Data Exploration Insights

Intermediate

During initial data exploration, note down interesting patterns, outliers, or potential data quality issues.

data review meetings

API Documentation Notes

Intermediate

When working with APIs, keep notes on endpoints, required parameters, and expected response formats.

experiment tracking

SQL Query Library

Beginner

Store frequently used or complex SQL queries with comments explaining their purpose.

data review meetings
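One lightweight way to keep such a library is a named dictionary of commented query strings. A sketch, with hypothetical table and column names:

```python
import textwrap

# Table and column names below are hypothetical -- adapt to your schema.
QUERY_LIBRARY = {
    "daily_active_users": textwrap.dedent("""\
        -- Distinct users with at least one event per day
        SELECT event_date, COUNT(DISTINCT user_id) AS dau
        FROM events
        GROUP BY event_date
        ORDER BY event_date;"""),
}

def get_query(name):
    """Fetch a stored query by name, failing loudly with the known names."""
    try:
        return QUERY_LIBRARY[name]
    except KeyError:
        raise KeyError(f"no query named {name!r}; known: {sorted(QUERY_LIBRARY)}")
```

Keeping the comment inside the SQL itself means the explanation travels with the query wherever it is pasted.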

Personal Learning Goals & Progress

Beginner

Track your personal development goals in data science, including resources studied and skills acquired.

general

Tool Comparison Notes

Intermediate

When evaluating different tools (e.g., visualization libraries, ML frameworks), note their pros and cons for specific tasks.

general

Deployment Checklist

Advanced

Create a checklist of all steps required to deploy a model or data product to production.

model review sessions

Monitoring Strategy Notes

Advanced

Document how a deployed model or dashboard will be monitored for performance degradation or data drift.

model review sessions

Ethical AI Considerations

Advanced

Maintain a running list of ethical considerations relevant to your projects, including fairness, transparency, and accountability.

model review sessions

Mentorship/Peer Discussion Insights

Beginner

Note down key advice, insights, or alternative perspectives gained from discussions with mentors or peers.

general

Future Research Directions

Intermediate

During analysis, jot down ideas for future research, deeper dives, or entirely new projects.

experiment tracking

Data Storytelling Frameworks

Intermediate

Experiment with and document different frameworks or structures for presenting data-driven narratives.

stakeholder presentations

Automated Testing Scenarios

Advanced

For critical data pipelines or models, outline scenarios for automated testing to ensure robustness.

experiment tracking

Knowledge Graph of Concepts

Advanced

Visually map out how different data science concepts, tools, or projects relate to each other.

general

Personal Productivity Hacks

Beginner

Note down any personal strategies or tools that help you manage your data science workflow more efficiently.

general

Impact vs. Effort Matrix

Intermediate

For potential projects or features, use a simple matrix to quickly assess their potential impact versus the effort required.

requirements gathering
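The matrix above can be scripted so every candidate gets a consistent label. A small sketch assuming 1-5 scores; the project names and scores are hypothetical:

```python
def quadrant(impact, effort, threshold=3):
    """Place a 1-5 impact/effort score pair into a 2x2 matrix cell."""
    if impact >= threshold and effort < threshold:
        return "quick win"
    if impact >= threshold:
        return "major project"
    if effort < threshold:
        return "fill-in"
    return "avoid"

# Hypothetical candidates with (impact, effort) scores.
ideas = {"churn model refresh": (5, 2), "full pipeline rewrite": (5, 5)}
labels = {name: quadrant(i, e) for name, (i, e) in ideas.items()}
```

Even when the scores are subjective, recording them forces the prioritization rationale into the notes.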

💡 Pro Tips

  • Use a version-controlled notebook (e.g., Jupyter with Git) for experiment tracking to maintain a clear history of your analysis.
  • Adopt a 'template-first' approach for meeting notes, including sections for 'Agenda', 'Decisions', 'Action Items', and 'Open Questions' to ensure consistency.
  • For stakeholder presentations, practice explaining complex technical concepts using analogies and note down which ones resonate best with your audience.
  • Implement a structured tagging system for your notes (e.g., #experiment_name, #meeting_type, #action_item) to easily retrieve specific information later.
  • When reviewing models, always document the business context and potential real-world implications of false positives/negatives alongside technical metrics.
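The tagging tip above can be made searchable with a few lines of Python; the tag names are examples:

```python
import re

TAG_RE = re.compile(r"#(\w+)")  # tags look like #experiment_name or #action_item

def notes_with_tag(notes, tag):
    """Return every note whose text carries the given #tag."""
    return [note for note in notes if tag in TAG_RE.findall(note)]

notes = [
    "Tuned the learning rate; see #experiment_churn_v2",
    "Follow up with finance on Q3 numbers #action_item",
    "Dashboard feedback logged #action_item #dashboard_walkthrough",
]
matches = notes_with_tag(notes, "action_item")
```

The same pattern works over a folder of plain-text or markdown notes read from disk.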
