NSERC Discovery grant applications are judged according to four criteria: (1) excellence of the researcher, (2) merit of the proposal, (3) contribution to the training of HQP (highly qualified personnel), and (4) cost of research. Each criterion has six possible merit indicators: Exceptional, Outstanding, Very strong, Strong, Moderate, and Insufficient. This presentation describes the process from both a candidate's and a reviewer's point of view. It discusses funding decisions, including bins and established researchers (ER) vs. early-career researchers (ECR). It offers some advice, including on graduating PhD students, having a story, and limiting the number of main objectives.
The Planning, Programming, and Budgeting System (PPBS) will be very useful for many organizations. It is based on zero-based planning, and the system is used to evaluate outcomes.
The document discusses key aspects of consulting proposals. It explains that a proposal is a written document detailing a consultant's ideas, planning, and methodology for carrying out an assignment. Proposals are prepared in response to a Request for Proposal and include technical and financial components. The technical proposal outlines the proposed approach, methodology, staffing plan, and experiences while the financial proposal specifies costs. The document provides guidance on creating comprehensive proposals that meet a client's needs and stand out among competition.
The document discusses the growing demand for project managers and the Project Management Professional (PMP) certification. It provides statistics on PMP certification holders and salaries. It outlines the requirements to sit for the PMP exam, including education, experience, and training hours required. It describes the exam format, content areas, and costs. It also discusses ongoing continuing education and recertification requirements to maintain the PMP certification.
Bid and Tender Process Training Presentation (Michael Lee)
The document outlines a 7-phase bid process for tendering projects, including opportunity assessment, pre-proposal planning, proposal development through writing and review, final production and submission, post-submission presentation preparation, and finally award, closeout and handover if won. It describes assembling a multi-disciplinary bid team, conducting requirement and competitor analyses, developing a capture plan and win strategy through workshops and reviews, and getting final approvals before submission and potential presentation and handover. The detailed process is meant to help organizations systematically pursue opportunities while managing risks.
This document provides guidance for experts evaluating proposals submitted for the Horizon 2020 (H2020) program. It outlines the objectives, roles and responsibilities of independent experts in evaluating proposals against the specified criteria of excellence, impact, and quality of implementation. Experts are responsible for carrying out impartial evaluations of assigned proposals within deadlines and reporting any misconduct. The evaluation criteria and process, including consensus reports, panel reviews, and proposal prioritization are also described.
This document provides information about an induction workshop for an NVQ programme in customer service and communication. It outlines the structure and content of the workshop, including introductions, an overview of the NVQ programme and qualifications, and the levels and requirements for level 2 and level 3 certificates. It also details the units, workshops, and assessment process involved in achieving the NVQ, and next steps for participants to take, such as deciding on their level and beginning to collect evidence.
The document discusses a workshop for program evaluators (PEVs) organized by the Board of Accreditation for Engineering and Technical Education (BAETE) in Dhaka. The workshop covers several interactive sessions to help PEVs design assessment forms and schedule on-site visits. It emphasizes evaluating programs based on BAETE's 11 accreditation criteria, with a focus on assessing attainment of program outcomes and continuous quality improvement. The document provides guidance on what PEVs should look for during visits and how outcomes should be assessed to determine compliance with accreditation standards.
The Freelancer Magento Experts model allows you to hire designers on a full-time, part-time, or hourly basis. This model can save you at least 30% in cost compared to any fixed-price quote, even one calculated at $15/hr.
Unlike the freelancing project portals, we are accountable for the results and ensure that only qualified designers work for you. The designers that you hire will be our full-time employees and they will work exclusively on your assignments for the duration of the contract. It's the easiest way to get design projects done. We guarantee it!
We are dedicated to helping our clients continually grow and succeed in their online business, and we incorporate this dedication into every thread of what we do. Our values aren't just something we list on our site. We believe in them. Recruit by them. Review by them. And work according to them.
Our numbers speak for themselves. With a proven track record of delivering creative, robust and, most importantly, increasingly profitable eCommerce sites, our diverse group of experts is committed to customer satisfaction and project excellence.
The document provides guidance for writing successful grant applications. It outlines important tips such as reading all instructions and guidance documents, writing a clear and compelling proposal that establishes the significance and impact of the research, and understanding how the application will be assessed. Reviewers will evaluate the quality, importance, people, resources, outputs, dissemination, and impact, so applicants should address these areas and anticipate any questions. It is important to choose the right funding scheme and communicate the research argument succinctly and effectively.
Tracking and Assessing Vocational Qualifications (John Gordon)
- Opus Learning develops online open courses to SQA HND standards for direct students and as a white label service. It provides the learning environment, content, and assessments.
- Tracking student interactions and progress is important for assessment, authentication, and identifying issues. Opus uses its learning platform and embedded tracking in course content to monitor students.
- Managing assessment load is key. Opus integrates assessments across units to reduce workload while maintaining standards. It also applies lessons from MOOCs to control costs through scale and automated assessment where possible.
Introduction to Project Management Level 4.pptx (Innoversity1)
The level 4 project management apprenticeship is an excellent qualification for personal and career development. This qualification includes the APM Project Management Qualification.
This document provides an overview of the Level 4 Apprenticeship in Project Management. It describes that the apprenticeship is aimed at those working in project management roles and offers the opportunity to train while gaining paid employment. Apprentices will complete on-program learning and assessment, including a Project Management Qualification and Functional Skills in Math and English. They will also produce a portfolio of work-based evidence. To complete the apprenticeship, apprentices must pass an end-point assessment consisting of a presentation, portfolio, and professional discussion with an independent assessor panel. Successful apprentices will earn a Level 4 Apprenticeship in Project Management.
The Planning Quality Framework is a collection of tools and techniques that use planning data to help councils understand their development management service performance and benchmark against others. It involves quantitative data like application counts and approval rates, as well as qualitative customer surveys. The framework provides regular reports to give councils insights into the value and quality of their work. It is a low-effort way to focus improvement efforts compared to traditional benchmarking approaches.
This presentation explains practical issues faced by vendors while bidding for a task from the government.
Please feel free to use the comment section to add more.
This document provides tips for entering the PRCA Medallion Awards competition, which recognizes excellence in public relations. It discusses the award tiers based on project length, why professionals should enter to gain recognition and training, and how to select projects to submit. Detailed instructions are provided on gathering materials, writing the entry in the four-step PR process format of research, planning, implementation and evaluation, and dressing up the collateral. The deadline, categories, and judging process are also outlined.
This document outlines key aspects of outcome-based education (OBE) and the accreditation process for engineering programs. It discusses OBE principles like focusing on what students learn rather than what is taught. The document also describes international accords for engineers, technologists, and technicians. It provides details on curriculum review, teaching methods, assessment tools, and continuous quality improvement in OBE. Finally, it lists the documentation required for accreditation visits, including program outcomes, course files, facilities, and actions taken on previous deficiencies.
BizTrans Systems & Technologies Pvt. Ltd. is an IT consulting firm that provides business transformation and engineering services. It offers solutions in analytics, enterprise applications, and engineering using technologies like SAP, Oracle, AutoCAD, and ANSYS. The company aims to operate development teams as businesses to improve delivery and productivity for clients. It also combines engineering, analytics, and enterprise capabilities across various industries. BizTrans has a talent enablement program that provides training, internships, and placement assistance to help trainees gain skills and experience working on projects.
This document provides an overview of procurement processes and methods. It defines key terms like goods, services, works, suppliers, bidders. It then discusses the differences between purchasing and procurement. The main methods of procurement covered are international competitive bidding, national competitive bidding, limited bidding, shopping, and direct purchase. It also covers procurement of consulting services and the main selection methods for consultants like quality and cost-based selection and quality-based selection. The document provides details on the steps involved in each selection method.
This document provides an introduction and overview of a course on software quality management. It outlines the course information, including the lecture topics, assignments, literature, and assessment. The course aims to teach quality basics, process improvement, product and project quality management, and defect prevention. Students will complete three assignments in groups and an individual exam. The assignments focus on code reviews, quality measurement, and process improvement. Overall, the document establishes the expectations and requirements for successfully completing the software quality management course.
The document discusses Macquarie University's HDR thesis examination process. It explains that PhD theses require examination by 3 external examiners while MPhil theses require 2. Examiners submit written reports and may recommend awarding the degree, awarding with corrections, revision and resubmission, or not awarding. Supervisors nominate examiners in consultation with students. Examiners should be significant figures in the field but not personally close to the student or supervisors. The examination process involves examiner reports, faculty responses, consideration by committees, and potential revisions by the student.
PLAN ASSESSMENT ACTIVITIES AND PROCESSES (Surono Surono)
The document provides guidance on planning assessment activities and processes for assessor training. It outlines four main steps:
1. Determine the assessment approach by identifying the candidate, purposes of assessment, context, and relevant benchmarks.
2. Prepare an assessment plan by determining required evidence, selecting appropriate assessment methods, and documenting the plan.
3. Identify any modification or contextualization needs based on the candidate and workplace.
4. Develop assessment instruments by drafting tools, mapping them to requirements, and trialing the instruments.
The document provides details on each step, emphasizing the importance of validity, reliability, flexibility, and fairness in assessment. It also references ASEAN standards and guidelines to help ensure quality and consistency.
Apprenticeship Standards - Team Forum Nov 17 (Samuel Fosbery)
Dimple Khagram, Managing Director of Skill-Serve Training, led a discussion about the transition of apprenticeships to the new standards: what these changes mean for colleges and training providers, and how to implement the reforms successfully.
The document discusses various aspects of project initiation and feasibility studies. It describes the types of feasibility studies conducted including technical, economic, environmental and marketing feasibility. It also outlines the project selection criteria and techniques used such as net present value, internal rate of return and payback period. The stages of engineering design from conceptual to detailed engineering are defined.
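The project-selection techniques named above (net present value and payback period) can be illustrated with a small worked example. This is a hypothetical sketch with invented cash flows, not code or figures from the document:

```python
def npv(rate, cash_flows):
    """Discount each year's cash flow back to year 0 and sum them."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def payback_period(cash_flows):
    """First year in which cumulative (undiscounted) cash flow turns non-negative."""
    total = 0.0
    for t, cf in enumerate(cash_flows):
        total += cf
        if total >= 0:
            return t
    return None  # outlay never recovered

# Year 0 outlay of 1000, then four years of 400 inflows (invented figures).
flows = [-1000.0, 400.0, 400.0, 400.0, 400.0]
print(round(npv(0.10, flows), 2))  # positive NPV at a 10% discount rate
print(payback_period(flows))       # 3: cumulative flow turns positive in year 3
```

A project is typically accepted when NPV is positive at the chosen discount rate; the internal rate of return is the rate at which this NPV function crosses zero.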
The document provides guidance on how to write an individual evaluation report (IER) for Horizon 2020 proposals. It outlines the key steps which include: checking for conflicts of interest; reading the call/topic description; understanding the expected structure and content of proposals; evaluating proposals against the criteria of excellence, impact, and quality of implementation; providing scored assessments and comments in the evaluation tool; and considering tips for the final IER writing. Evaluators are instructed to provide a factual assessment based on the written proposal without external input or recommendations. The annex includes an example dummy IER covering the three main evaluation criteria.
Some Pitfalls with Python and Their Possible Solutions v1.0 (Yann-Gaël Guéhéneuc)
Python is a very popular programming language that comes with many pitfalls. This presentation describes some of these pitfalls, especially when they could trick unsuspecting object-oriented developers. It proposes solutions to these pitfalls, in particular regarding inheritance, which is easily broken because of the implementation choice of Python for explicit delegation, its method resolution order, and its use of the C3 algorithm. It discusses some advantages of using Python, especially regarding meta-classes.
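The method resolution order (MRO) behaviour mentioned here can be shown in a few lines. This is a minimal hypothetical example, not code from the presentation; it demonstrates how Python's C3 linearization decides which method runs in a diamond hierarchy:

```python
class Base:
    def greet(self):
        return "Base"

class Left(Base):
    def greet(self):
        return "Left"

class Right(Base):
    def greet(self):
        return "Right"

class Child(Left, Right):
    pass

# C3 linearization: Child -> Left -> Right -> Base -> object.
print([cls.__name__ for cls in Child.__mro__])

# Left "wins" only because it appears first in the base-class list;
# swapping the bases silently changes which greet() is inherited.
print(Child().greet())  # prints "Left"
```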
Ptidej Architecture, Design, and Implementation in Action v2.1 (Yann-Gaël Guéhéneuc)
A set of process, architecture, design, and implementation patterns from a real, large program, the Ptidej Tool Suite. This set shows concrete problems and their solutions in Java. It includes: Be A Profiler, Tests as Documentation, Multi-layered Architecture, Proxy Console, Proxy Disk, Hidden Language, Internal Observer, Run-time Deprecation, String Parsimony, Object Identity, Object Address, Final Construction, StringBuffer as Positioning Element.
More related content
Similar to Advice for writing a NSERC Discovery grant application v0.5
Examples of (bad) consequences of a lack of software quality, and some solutions. This presentation gives examples of the (bad) consequences of poor software quality, in particular how it led directly to the deaths of 89 people. It then provides some background on software quality, especially the concept of Quality Without a Name, and discusses many principles, their usefulness, and their positive consequences for software quality. Some of these principles are well known in object-oriented programming, while many others are taken from the book 97 Things Every Programmer Should Know. They include: abstraction, encapsulation, inheritance, types, polymorphism, SOLID, GRASP, YAGNI, KISS, DRY, Do Not Reinvent the Wheel, the Law of Demeter, Beware of Assumptions, Deletable Code, coding with reason, and functional programming. They pertain to dependencies, domains, and tools.
(In detail: Beauty Is in Simplicity, The Boy Scout Rule, You Gotta Care About the Code, The Longevity of Interim Solutions, Beware the Share, Encapsulate Behaviour not Just State, The Single Responsibility Principle, WET Dilutes Performance Bottlenecks, Convenience Is Not an -ility, Code in the Language of the Domain, Comment Only What the Code Cannot Say, Distinguish Business Exceptions from Technical, Prefer Domain-Specific Types to Primitive Types, Automate Your Coding Standards, Code Layout Matters, Before You Refactor, Improve Code by Removing It, Put the Mouse Down and Step Away from the Keyboard)
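One of the listed principles, the Law of Demeter, can be sketched in a few lines. This is a hypothetical illustration (the class names are invented, not taken from the presentation): an object should talk to its immediate collaborators rather than reach through chains of accessors:

```python
class Wallet:
    def __init__(self, balance):
        self.balance = balance

class Customer:
    def __init__(self, balance):
        self._wallet = Wallet(balance)

    def pay(self, amount):
        """Demeter-friendly: the customer manages its own wallet."""
        if self._wallet.balance < amount:
            raise ValueError("insufficient funds")
        self._wallet.balance -= amount

    def balance(self):
        return self._wallet.balance

# A violation would be a caller reaching inside the customer:
#   customer._wallet.balance -= price
# which couples the caller to Wallet's internals.
customer = Customer(100)
customer.pay(30)
print(customer.balance())  # prints 70
```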
Some Pitfalls with Python and Their Possible Solutions v0.9 (Yann-Gaël Guéhéneuc)
Python is a very popular programming language that comes with many pitfalls. This presentation describes some of these pitfalls, especially when they could trick unsuspecting object-oriented developers. It proposes solutions to these pitfalls, in particular regarding inheritance, which is easily broken because of the implementation choice of Python for explicit delegation, its method resolution order, and its use of the C3 algorithm. It discusses some advantages of using Python, especially regarding meta-classes.
An Explanation of the Unicode, the Text Encoding Standard, Its Usages and Imp...Yann-Gaël Guéhéneuc
Unicode is currently the world standard for encoding text. It supports all of the world's major writing systems. With its version 15.1 of 2023/09/12, it defines 149,813 characters and 161 scripts. This presentation starts with the, seemingly, simple example of the polar bear emoji. It then defines the key terms of any such standard. It then asks how a software system can render orthographic characters into glyphs, i.e., to render characters into (combined) glyphs. It introduces the concept of abstract characters and describes a brief history of encoding standards, from ASCII to Unicode. It shows how, by adding one level of indirection, the Unicode standard answers this question. It then presents code examples to display text written in Unicode: HarfBuzz (for shaping) and FreeType (for rendering).
An Explanation of the Halting Problem and Its ConsequencesYann-Gaël Guéhéneuc
The halting problem is an important, famous, and consequential problem in computer science. It is about writing a program that decides if another problem will stop. There is no general solution to this problem, which shows that such a problem is undecidable, with important consequences: for example, it is not possible to write tests that would exhaustively test entirely an arbitrary program. This presentation was written in collaboration with <a href="https://www.iro.umontreal.ca/~hahn/">Gena Hahn</a>.
A presentation summarising FPGAs, their history, their benefits, and showing how to program them. It provides some historical background on the development of computers, from the Difference Engine to the Intel 4004 to the AMD Ryzen Threadripper PRO 3995WX. It shows how the number of transistors increased dramatically but also how this increase led to more complexity and more bugs. It then introduces Field-programmable gate arrays (FPGA) as an alternative. It then presents how to program such FPGA using data-flow graphs. It discusses some tools (Yosys, NextPnR, and IceStorm) and illustrates them with a typical "Hello World" (i.e., blinking an LED) using Cygwin on Windows 10.
A set of brief presentations of some of the women and men who made the history of computer science and software engineering.
- 1936: Alan Turing
- 1948: Claude Elwood Shannon
- 1950: Grace Murray Hopper
- 1960: John McCarthy
- 1966: Frances E. Allen
- 1967: Ole-Johan Dahl
- 1967: Kristen Nygaard
- 1969: Charles A. R. Hoare
- 1970: Edgar F. Codd
- 1972: Dave Parnas
- 1974: Manny Lehman
- 1975: Frederick Brooks
- 1986: Edward Yourdon
- 1987: Barbara Liskov
- 1994: Erich Gamma
- 1997: Grady Booch
- 2001: Butler Lampson
A tutorial on the history, use, and caveats of Java generics. Using the simple example of an interface for sort algorithms, the tutorial presents the history of generics and describes the problems being solved by generics. It also provides definitions, and examples in Java and C++, and discusses Duck Typing. It then describes two scenarios: (1) Scenario 1: you want to enforce type safety for containers and remove the need for typecasts when using these containers and (2) Scenario 2: you want to build generic algorithms that work on several types of (possibly unrelated) things. It also summarises caveats with generics, in particular type erasure.
A tutorial on reflection, with a particular emphasis on Java, with a comparison with C++, Python, and Smalltalk. It describes different scenarios in which reflection is useful, a brief history of reflection and MOPs, a comparison with C++, Python, and Smalltalk, and some particulars about Java. The source code of the examples in Java (Eclipse project), Smalltalk (Squeak image v3.10.6), Python (Eclipse project), and C++ (Eclipse projects and Visual Studio solution) are available. (C++ Eclipse projects require Mirror.) Big thanks to Matúš Chochlík and Marcus Denker for their kind and precious help with C++ and Smalltalk.
The tutorial focuses on four common problems:
- Avoid using instanceof when code must bypass the compiler and virtual machine’s choice of the method to call.
- Create external, user-defined pieces of code loaded, used, and unloaded at run-time.
- Translate data structures or object states into a format that can be stored (file, network...).
- Monitor the execution of a program to understand its behaviour, and measure its space and time complexity.
It shows working examples of Java, Smalltalk, Python, and C++ code solving the four common problems through four scenarios:
- Scenario 1: invoke an arbitrary method on an object (see the problems with instanceof and plugins).
- Scenario 2: access the complete (including private) state of an object (see the problem with serialisation).
- Scenario 3: count the number of instances of a class created at runtime (see the problem with debugging/profiling).
- Scenario 4: patch the method of a class to change its behaviour (see the problem with patching).
It also discusses the different kinds of interconnections among objects that are available in common programming languages (linking, forking, subclassing, inter-process communication, and dynamic loading/invoking), a bit of theory about reflection, and specifically the class-loading mechanism of Java.
REST APIs are nowadays the de-facto standard for Web applications. However, as more systems and services adopt the REST architectural style, many problems arise regularly. To avoid these repetitive problems, developers should follow good practices and avoid bad practices. Thus, research on good and bad practices and how to design a simple but effective REST API are essential. Yet, to the best of our knowledge, there are only a few concrete solutions to recurring REST API practices, like “API Versioning”. There are works on defining or detecting some practices, but not on solutions to the practices. We present the most up-to-date list of REST API practices and formalize them in the form of REST API (anti)patterns. We validate our design (anti)patterns with a survey and interviews of 55 developers.
Analyzing and Visualizing Projects and their Relations in Software Ecosystems presents an approach to help developers understand and navigate between projects in related software ecosystems. The approach generates word clouds from project documentation to summarize projects. It then maps relationships between word clouds to identify related projects. This allows developers to better understand the scope and connections between projects within software ecosystems.
This document presents an approach for automatically identifying antipatterns in microservice-based systems. It defines a meta-model with 13 components to capture necessary information about a system and its microservices. It also identifies 15 common microservice antipatterns. Detection rules are defined for each antipattern based on analyzing the system's source code, dependencies, configuration and other artifacts. The goal is to develop a tool based on this approach to help developers minimize antipatterns in microservice systems and improve their maintenance and evolution.
The document presents a preliminary study comparing several open-source IoT development frameworks: Eclipse Vorto, ThingML, Node-red, and OpenHab. The researchers designed the study to evaluate the frameworks' ability to support basic IoT application requirements. They implemented examples from three common IoT application categories using each framework and analyzed the results. Overall, Node-red required the least effort while the other frameworks had more limitations. Future work could study how the frameworks complement each other and implement more complex examples.
This document describes a dataset of software engineering problems in video game development extracted from over 200 postmortems published between 1997 and 2019. The dataset contains 1,035 problems across 20 problem types and is intended to summarize developers' experiences and difficulties during game development. It is available on GitHub at the listed URL.
This document summarizes research into software engineering patterns for designing machine learning systems. A survey found that ML developers have little knowledge of applicable architecture and design patterns. A literature review identified 19 scholarly papers and 19 gray documents discussing practices. The research aims to classify ML patterns according to the typical ML pipeline process and software development lifecycle. It identifies 12 architecture patterns, 13 design patterns, and 8 anti-patterns for ML systems. Future work includes documenting the patterns fully and analyzing their impact on ML system quality attributes.
This document describes a type-sensitive service identification approach called ServiceMiner to support the migration of legacy systems to service-oriented architectures. ServiceMiner uses static analysis and detection rules tailored to specific service types to identify services. It was validated on an open-source ERP system, Compiere, where it achieved higher identification accuracy than other approaches and reduced the effort required to identify architecturally significant services. The approach automates service identification, allows prioritizing types, and is extensible to new technologies.
Practitioners surveyed had experience migrating legacy systems, mainly written in COBOL and Java, to SOA architectures. The top motivations for legacy-to-SOA migration were reducing maintenance costs and improving flexibility. Most common migration strategies were rehosting, re-architecture, or a mix. Service identification was seen as important and drew from sources like source code, business processes, and human expertise. Top-down and mixed approaches were commonly used. Functionality clustering and wrapping were the dominant service identification techniques. The process aimed to identify coarse-grained, high-value domain services rather than technical services. Full automation was not a primary focus due to challenges in predicting results for enterprise systems. Recommendations included
Advice for writing a NSERC Discovery grant application v0.5
1. Yann-Gaël Guéhéneuc
Département de génie informatique et de génie logiciel
This work is licensed under a Creative
Commons Attribution-NonCommercial-
ShareAlike 3.0 Unported License
NSERC
DG Advice
yann-gael.gueheneuc@polymtl.ca
Version 0.5
2013/07/07
3. 3/89
Disclaimer: I cannot be held responsible for
the failure or success of your applications,
whether or not you follow this advice
4. 4/89
NSERC DG Advice
Each NSERC DG application is evaluated
according to 4 criteria and 6 merit indicators
– Excellence of the researcher
– Merit of the proposal
– Contribution to the training of HQP
– Cost of research
5. 5/89
NSERC DG Advice
Each NSERC DG application is evaluated
according to 4 criteria and 6 merit indicators
– Exceptional
– Outstanding
– Very strong
– Strong
– Moderate
– Insufficient
6. 6/89
NSERC DG Advice
How do the reviewers rate these criteria
using the indicators?
How can you ease the reviewers’ job?
And how can you be more successful?
7. 7/89
Outline
Process in a nutshell
– Candidate’s point of view
– Internal reviewer’s point of view
• Off-line work
• Competition week
Funding decisions
– In a nutshell
– Bins
– ER vs. ECR
Criteria and indicators
– “Values” of criteria
– “Meanings” of indicators
– NSERC rating form
– My own rating form
Advice
– Introduction
– Excellence of the researcher
– Merit of the proposal
– Contribution to the training of HQP
– Form 100
– Form 101
Conclusion
Further readings
8. 8/89
Outline
Process in a nutshell
– Candidate’s point of view
– Internal reviewer’s point of view
• Off-line work
• Competition week
Funding decisions
– In a nutshell
– Bins
– ER vs. ECR
Criteria and indicators
– “Values” of criteria
– “Meanings” of indicators
– NSERC rating form
– My own rating form
Advice
– Introduction
– Excellence of the researcher
– Merit of the proposal
– Contribution to the training of HQP
– Form 100
– Form 101
Conclusion
Further readings
9. 9/89
Process in a Nutshell
From the candidate’s point of view
– August 1st: submission of Form 180
– November 1st: final submission of Forms 100,
101, and publications
– March/April: announcement of the results
10. 10/89
Process in a Nutshell
From the internal reviewer’s point of view
– Two main parts
• Off-line work, e-mails/readings
• Competition week in Ottawa
11. 11/89
Process in a Nutshell
Off-line work
– August 27th: reception of all the submissions
• In 2012, 322 submissions
– September 7th: ratings (expertise levels and
conflicts) of all the submissions
• High, Medium, Low, Very Low, Conflict, X (language)
– September 24th: final choice of the 1st internal
reviewers for each application
• In 2012, 14 applications as 1st internal reviewer, 15
as 2nd internal reviewer, 17 as reader = 46
12. 12/89
Process in a Nutshell
Off-line work
– October 5th: choice by the 1st internal reviewer of
5 external referees
• In 2012, 14 applications = 70
• May include referees suggested by the candidate but
may also replace all of them
– October 22nd: ratings of applications from other
evaluation groups
• In 2012, 1 application
13. 13/89
Process in a Nutshell
Off-line work
– Early December: final list of readings
• In 2012, 47 applications
– January/February: reception of the reports from
the external referees
• In 2012, 123 reports
– February 18th to 22nd: competition week in
Ottawa during which each application is
discussed and rated
14. 14/89
Process in a Nutshell
Off-line work
– In 2012 (and I suspect every year), a lot of work!
• 322 submissions
• 47 applications (including joint publications)
• 70 referees
• 123 referee reports
15. 15/89
Process in a Nutshell
Off-line work
– In 2012 (and I suspect every year), a lot of work!
• 322 submissions
• 47 applications (including joint publications)
• 70 referees
• 123 referee reports
Make it easier
for the reviewers
16. 16/89
Process in a Nutshell
Competition week
– February 18th to 22nd: competition week in
Ottawa during which each application is
discussed and rated
– 5 days
• In 2012 (and I suspect every year), very intense,
demanding, and tiring
17. 17/89
Process in a Nutshell
Competition day
– Starts at 8:30am
– Divides into
• 31 15-minute slots
• 2 15-minute breaks
• 1 45-minute lunch
– Ends at 5:15pm
• If no deferred applications to re-discuss
– In 2012, 1 application
18. 18/89
Process in a Nutshell
Competition slot
– In a 15-minute slot, the ratings of an application
are chosen by the reviewers
– Or the application is “deferred”, to be
re-discussed at the end of the day
19. 19/89
Process in a Nutshell
Competition slot
– 1st internal reviewer gives ratings with
justifications, which must be facts in the Forms
– 2nd internal reviewer contrasts, supports, adds
missing facts from the Forms
– The readers complement or challenge the ratings
given by the 1st and 2nd internal reviewers; their
points must be supported by facts from the Forms
20. 20/89
Process in a Nutshell
Competition slot
– 1st internal reviewer gives ratings with
justifications, which must be facts in the Forms
• In 2012, a typical presentation followed this pattern
– Candidate: career, funding, visibility, publications, HQP
record, planned training
– Proposal: context, lacks, characteristics (Incremental?
Applicable? Feasible?)
– External: summary of the referees' reviews, summary of the
provided contributions
then, the reviewer would give his ratings
21. 21/89
Process in a Nutshell
Competition slot
– 1st internal reviewer gives ratings with
justifications, which must be facts in the Forms
• In 2012, a typical presentation followed this pattern
– Candidate: career, funding, visibility, publications, HQP
record, planned training
– Proposal: context, lacks, characteristics (Incremental?
Applicable? Feasible?)
– External: summary of the referees' reviews, summary of the
provided contributions
then, the reviewer would give his ratings
Not exactly the
NSERC criteria
22. 22/89
Process in a Nutshell
Competition slot
– The session chair keeps time strictly
– The session chair makes sure that any
discussion sticks to the facts
23. 23/89
Process in a Nutshell
Competition slot
– Ratings are anonymous
• Secret electronic vote
• The session chair announces the results
– Ratings are consensual
• If reviewers/readers strongly disagree, the application
will be re-discussed at the end of the day
– In 2012, I did not see any strong debates: mostly the 1st and 2nd
internal reviewers agreed, backed up by the readers
– In 2012, some facts were sometimes highlighted and ratings
were changed accordingly
24. 24/89
Process in a Nutshell
Competition slot
– Any criterion rated as Moderate or Insufficient
receives comments from the committee, reflecting
the consensus of the reviewers (highly focused)
• In 2012, NSERC provided typical comments, for
example: “The applicant did not take advantage of the
available space in Form 100 to make a compelling
case about his/her most significant research
contributions. Given the lack of information, the EG
was unable to carry out a thorough assessment and
potentially recommend a higher rating.”
25. 25/89
Outline
Process in a Nutshell
– Candidate’s point of view
– Internal reviewer’s point of view
• Off-line work
• Competition week
Funding decisions
– In a nutshell
– Bins
– ER vs. ECR
Criteria and indicators
– “Values” of criteria
– “Meanings” of indicators
– NSERC rating form
– My own rating form
Advice
– Introduction
– Excellence of the researcher
– Merit of the proposal
– Contribution to the training of HQP
– Form 100
– Form 101
Conclusion
Further readings
26. 26/89
Funding Decisions
In a nutshell
– Each proposal is rated by the reviewers secretly
after the discussions
– The medians of the ratings are used for criteria
– For example
• Excellence of researcher: {S, S, M, M, M}, rating is M
• Merit of the proposal: {V, V, S, S, M}, rating is S
• Impact of HQP: {V, S, S, S, M}, rating is S
• The application rating is therefore {M, S, S}
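The median arithmetic above can be sketched in a few lines of Python. This is a hypothetical illustration, not NSERC's actual procedure; the letter-to-number mapping is an assumption consistent with the bin values given later in the deck ({E, E, E} = 18, {I, I, I} = 3).

```python
# Sketch of computing one criterion's rating as the median of the
# reviewers' votes (assumed mapping, from Insufficient = 1 to Exceptional = 6).
import statistics

VALUES = {"I": 1, "M": 2, "S": 3, "V": 4, "O": 5, "E": 6}
LETTERS = {v: k for k, v in VALUES.items()}

def criterion_rating(votes):
    """Rating of one criterion: the median of the reviewers' votes."""
    return LETTERS[int(statistics.median(VALUES[v] for v in votes))]

# The three examples from the slide:
print(criterion_rating(["S", "S", "M", "M", "M"]))  # M
print(criterion_rating(["V", "V", "S", "S", "M"]))  # S
print(criterion_rating(["V", "S", "S", "S", "M"]))  # S
```

With an odd number of reviewers the median is always one of the six indicators, so no rounding convention is needed.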
27. 27/89
Funding Decisions
Bins
– The numeric “values” of the ratings are “added”
• For example, {M, S, S} → 2 + 3 + 3 = 8
– The application is placed into one of 16 bins
– The bins are labelled A through P and
correspond numerically to sums of 18 down to 3
28. 28/89
Funding Decisions
Bins
– Bins A and P are uniquely mapped to {E, E, E}
and {I, I, I} while other bins contain a mix of
numerically equivalent ratings, e.g., {V, S, M} is
in the same bin as {S, S, S} and {M, S, V}
• For example, the application rated {M, S, S} is in K
– Not all applications in a bin are funded: {S, S, S}
may be funded while {M, S, V} is not
• Because of the Moderate indicator for the first criterion
– Cut-off point depends on year
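The bin arithmetic above can be sketched as follows. Again a hypothetical illustration under the assumed mapping Insufficient = 1 through Exceptional = 6, so that the sums run from 18 ({E, E, E}, bin A) down to 3 ({I, I, I}, bin P).

```python
# Sketch of mapping an application's three criterion ratings onto
# the 16 bins A (sum 18) down to P (sum 3). Assumed values, not
# NSERC's actual implementation.
VALUES = {"I": 1, "M": 2, "S": 3, "V": 4, "O": 5, "E": 6}

def application_bin(ratings):
    """Bin letter for the three criterion ratings of an application."""
    total = sum(VALUES[r] for r in ratings)  # e.g. {M, S, S} -> 2 + 3 + 3 = 8
    return chr(ord("A") + 18 - total)        # 18 -> A, 17 -> B, ..., 3 -> P

print(application_bin(["M", "S", "S"]))  # K, as in the slide's example
print(application_bin(["E", "E", "E"]))  # A
print(application_bin(["I", "I", "I"]))  # P
```

Note that the bin alone does not decide funding: as the slide says, numerically equivalent ratings such as {S, S, S} and {M, S, V} land in the same bin yet may be funded differently.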
29. 29/89
Funding Decisions
ER vs. ECR
– Candidates are divided into
• ER: established researchers, who have already
applied (successfully or not) to the NSERC DG
• ECR: early-career researchers, who apply to the
NSERC DG for the first time
– ECR are funded one bin “lower” (better) than ER
30. 30/89
Outline
Process in a Nutshell
– Candidate’s point of view
– Internal reviewer’s point of view
• Off-line work
• Competition week
Funding decisions
– In a nutshell
– Bins
– ER vs. ECR
Criteria and indicators
– “Values” of criteria
– “Meanings” of indicators
– NSERC rating form
– My own rating form
Advice
– Introduction
– Excellence of the researcher
– Merit of the proposal
– Contribution to the training of HQP
– Form 100
– Form 101
Conclusion
Further readings
31. 31/89
Criteria and Indicators
“Values” of criteria
– Excellence of the researcher
– Merit of the proposal
– Contribution to the training of HQP
– Cost of research
32. 32/89
Criteria and Indicators
“Values” of criteria
– Excellence of the researcher
• Knowledge, expertise and experience
• Quality of contributions to, and impact on, research
areas in the NSE
• Importance of contributions
33. 33/89
Criteria and Indicators
“Values” of criteria
– Merit of the proposal
• Originality and innovation
• Significance and expected contributions to research;
potential for technological impact
• Clarity and scope of objectives
• Methodology and feasibility
• Extent to which the scope of the proposal addresses
all relevant issues
• Appropriateness of, and justification for, the budget
• Relationship to other sources of funds
34. 34/89
Criteria and Indicators
“Values” of criteria
– Merit of the proposal
• Originality and innovation
• Significance and expected contributions to research;
potential for technological impact
• Clarity and scope of objectives
• Methodology and feasibility
• Extent to which the scope of the proposal addresses
all relevant issues
• Appropriateness of, and justification for, the budget
• Relationship to other sources of funds
Not really important
35. 35/89
Criteria and Indicators
“Values” of criteria
– Merit of the proposal
• Originality and innovation
• Significance and expected contributions to research;
potential for technological impact
• Clarity and scope of objectives
• Methodology and feasibility
• Extent to which the scope of the proposal addresses
all relevant issues
• Appropriateness of, and justification for, the budget
• Relationship to other sources of funds
Not really important
Amounts of previous
grants (in particular
NSERC DG) should
be ignored
36. 36/89
Criteria and Indicators
“Values” of criteria
– Contribution to the training of HQP
• Quality and impact of contributions
• Appropriateness of the proposal for the training of
HQP in the NSE
• Enhancement of training arising from a collaborative
or interdisciplinary environment, where applicable
38. 38/89
Criteria and Indicators
“Values” of criteria
– Cost of research
• Rationale
Not really important
but you cannot receive
more than you ask for,
no matter the merit
39. 39/89
Criteria and Indicators
“Meanings” of indicators
– Exceptional
– Outstanding
– Very strong
– Strong
– Moderate
– Insufficient
40. 40/89
Criteria and Indicators
“Meanings” of indicators
– Exceptional
• In 2012, I did not see any exceptional ratings
– Outstanding
– Very strong
– Strong
– Moderate
– Insufficient
42. 42/89
Criteria and Indicators
NSERC rating form
– NSERC provides a 2-page rating form
• In 2012, I found that this rating form did not follow
the presentation pattern used during the competition
slot because it scatters the information
• In 2012, however, each application was obviously
rated according to the 4 criteria and 6 indicators
43. 43/89
Criteria and Indicators
NSERC rating form (1/2)
Applicant: University: Application I.D.:
Applicant Status:
Title of Application:
Evaluation criteria (See Section 6 of Peer Review Manual for complete details)
Excellence of researcher(s)
Exceptional Outstanding Very Strong
Strong Moderate Insufficient
Knowledge, expertise and experience
Quality of contributions to, and impact on, research areas in the NSE
Importance of contributions
For Team applications: complementarity of expertise between members and synergy
Rationale for rating:
Merit of the proposal
Exceptional Outstanding Very Strong
Strong Moderate Insufficient
Originality and innovation
Significance and expected contributions to research; potential for technological impact
Clarity and scope of objectives
Methodology and feasibility
Extent to which the scope of the proposal addresses all relevant issues
Appropriateness of, and justification for, the budget
Relationship to other sources of funds
Rationale for rating:
Contributions to training of highly qualified personnel
Exceptional Outstanding Very Strong
Strong Moderate Insufficient
Quality and impact of contributions during the last six years
Appropriateness of the proposal for the training of HQP in the NSE
Enhancement of training arising from a collaborative or interdisciplinary environment,
where applicable
Rationale for rating:
Cost of research (relative cost of the proposed research program as compared to the norms for the field) Low Normal High
Rationale for Cost of Research:
44. 44/89
Criteria and Indicators
NSERC rating form (2/2)
Other comments (e.g., duration should exceptionally be less than norm, special circumstances, quality of samples of
contributions provided, environmental impact, ethical concerns. Your Program Officer should be notified accordingly):
Summary of assessment by external referees (please highlight any comments that would be deemed inappropriate for the
Evaluation Group to consider in their discussions):
Points for message to applicant (if rating of “Moderate” or “Insufficient” on any criterion or duration shorter than norm):
Discovery Accelerator Supplement (DAS):
Regular DAS: Yes No
DAS in Targeted Areas: Yes No
Rationale for DAS Recommendation:
45. 45/89
Criteria and Indicators
My own rating form
Career 1: <Indicator>
Funding
Visibility
Publications
HQP Record 3: <Indicator>
Planned training
Context 2: <Indicator>
Lacks
Incremental?
Applicable?
Feasible?
Summary of the
referees' reviews
Provided
contributions
46. 46/89
Criteria and Indicators
My own rating form
Career 1: <Indicator>
Funding
Visibility
Publications
HQP Record 3: <Indicator>
Planned training
Context 2: <Indicator>
Lacks
Incremental?
Applicable?
Feasible?
Summary of the
referees' reviews
Provided
contributions
Researcher
47. 47/89
Criteria and Indicators
My own rating form
Career 1: <Indicator>
Funding
Visibility
Publications
HQP Record 3: <Indicator>
Planned training
Context 2: <Indicator>
Lacks
Incremental?
Applicable?
Feasible?
Summary of the
referees' reviews
Provided
contributions
Researcher
Proposal
48. 48/89
Criteria and Indicators
My own rating form
Career 1: <Indicator>
Funding
Visibility
Publications
HQP Record 3: <Indicator>
Planned training
Context 2: <Indicator>
Lacks
Incremental?
Applicable?
Feasible?
Summary of the
referees' reviews
Provided
contributions
Researcher
Proposal
HQP
49. 49/89
Outline
Process in a Nutshell
– Candidate’s point of view
– Internal reviewer’s point of view
• Off-line work
• Competition week
Funding decisions
– In a nutshell
– Bins
– ER vs. ECR
Criteria and indicators
– “Values” of criteria
– “Meanings” of indicators
– NSERC rating form
– My own rating form
Advice
– Introduction
– Excellence of the researcher
– Merit of the proposal
– Contribution to the training of HQP
– Form 100
– Form 101
Conclusion
Further readings
50. 50/89
Advice
Introduction
– Reviewers receive two to three dozen applications
– Overall, upon first review, the quality is
impressive, thus generating a positive reaction
– The objective, however, is to discriminate,
initiating a vigorous search for flaws
51. 51/89
Advice
Introduction
– Reviewers may perceive aspects of applications
as confusing, ambiguous, incomplete, or just not
compelling
– They will not give the benefit of the doubt
• In 2012, I witnessed some excellent researchers
receiving low ratings because of sloppiness in
their applications
52. 52/89
Advice
Introduction
– Reviewers will most likely “mine” the Forms 100,
101, and publications to make up their minds
regarding the 4 criteria
Make it easy for them to mine your applications!
53. 53/89
Advice
Introduction
Career 1: <Indicator>
Funding
Visibility
Publications
HQP Record 3: <Indicator>
Planned training
Context 2: <Indicator>
Lacks
Incremental?
Applicable?
Feasible?
Summary of the
referees' reviews
Provided
contributions
54. 54/89
Advice
Introduction
Career 1: <Indicator>
Funding
Visibility
Publications
HQP Record 3: <Indicator>
Planned training
Context 2: <Indicator>
Lacks
Incremental?
Applicable?
Feasible?
Summary of the
referees' reviews
Provided
contributions
Form 100
55. 55/89
Advice
Introduction
Career 1: <Indicator>
Funding
Visibility
Publications
HQP Record 3: <Indicator>
Planned training
Context 2: <Indicator>
Lacks
Incremental?
Applicable?
Feasible?
Summary of the
referees' reviews
Provided
contributions
Form 100
Form 101
56. 56/89
Advice
Introduction
– Form 100
• Is used for two of the three important criteria
– Form 101
• Is mostly used for the merit of the proposal
57. 57/89
Excellence of the Researcher
Form 100
– Career: pages 1-2
– Funding: pages 3-…
– Visibility
• “Other Evidence of Impact and Contributions”
• Awards, chairing, editorship, organisation, seminars:
anything showing external acknowledgments
– Publications
• “Other Research Contributions”
• Quantity and quality
58. 58/89
Excellence of the Researcher
Form 101
– Essentially nothing
Contributions
– Important for the experts, should be explained
for the non-experts in Form 100, “Most
Significant Contributions to Research”
External reviewers
– Confirm/contrast findings in Forms 100 and 101
and in the publications
59. 59/89
Merit of the Proposal
Form 101
– Context
• Is the application well positioned?
– Lacks
• Any problems not discussed?
– Incremental?
• How innovative?
– Applicable?
• Usefulness, even remote?
– Feasible?
• Methodology
60. 60/89
Merit of the Proposal
Form 101
– Reviewers may also look for
• Knowledge of the key issues (background)
• Originality and innovation (background limits)
• Clarity of scope and objectives
• Methodology
– Trust/confidence that you can do the work
• Significance
61. 61/89
Merit of the Proposal
Form 100
– Essentially nothing
Contributions
– Essentially nothing
External reviewers
– Confirm/contrast findings in Form 101
62. 62/89
Contribution to the Training of HQP
Form 100
– HQP Record
• Pages 1, 5-…
• Make it consistent; report what the students do now
– Planned training
• “Contributions to the Training of Highly Qualified
Personnel”
63. 63/89
Contribution to the Training of HQP
Form 101
– Last part on “Contribution to the Training of
Highly Qualified Personnel (HQP)”
Contributions
– Essentially nothing
External reviewers
– Confirm/contrast findings in Forms 100 and 101
64. 64/89
Forms 100 and 101
In 2012, in general, my reading/rating was
made easier when the application carefully
followed the NSERC suggested
guidelines/templates
– Any missing/misplaced/different category was
disruptive and sent a bad signal: “I want to do
differently from others”
65. 65/89
Form 100
In 2012, here is what I particularly looked at
“TRAINING OF HIGHLY QUALIFIED PERSONNEL”
– Total number of students, to know if the
candidate is actively supervising students
– Any increase/decrease
– Any inflation of the numbers just before the
submission, which sends a bad message
66. 66/89
Form 100
In 2012, here is what I particularly looked at
“ACADEMIC, RESEARCH AND INDUSTRIAL EXPERIENCE”
– Current and past positions
– Any industrial experience
– Experience that could explain an absence of
publications or of HQP
67. 67/89
Form 100
In 2012, here is what I particularly looked at
“RESEARCH SUPPORT”
– Does the candidate, given his/her experience,
have appropriate funds?
• NSERC and industry are most important
• Others can help but should be explained, possibly in
the contributions (F100) or in the budget (F101)
• Amounts are not so important but give a signal
• They show the candidate’s willingness and relevance
68. 68/89
Form 100
In 2012, here is what I particularly looked at
“HIGHLY QUALIFIED PERSONNEL (HQP)”
– The candidate’s contribution to the training of
past/current HQP
• Titles of the projects should be meaningful and
focused; unrelated titles send a bad signal
• “Present positions” are important and show that the
candidate follows his/her students
• The degree obtained/pursued by the HQP (Ph.D.?
M.Sc.? B.Sc.? Others?)
69. 69/89
Form 100
In 2012, here is what I particularly looked at
“MOST SIGNIFICANT CONTRIBUTIONS TO RESEARCH”
– Evidence of scientific contributions
• I asked myself “do I know this candidate?” If unknown, I
searched the places of publication to assess the
quality of the contributions
• References should include where the papers
were published, to ease the reviewer’s task
• The contributions should be related to the topic of the
current NSERC DG to show continuation
70. 70/89
Form 100
In 2012, here is what I particularly looked at
“ARTICLES IN REFEREED PUBLICATIONS”
– Evidence of scientific contributions
• Quality was first and foremost: I looked at the venues
and assessed their quality
– Acronyms must be explained, publisher names, years…
must be given, systematically
– Bibliographic metric values may be given; Google Scholar
citation counts are accepted; other metrics are discounted
• Quantity was a plus but, without quality, it sent a bad
signal: “I publish without focus”
71. 71/89
Form 100
In 2012, here is what I particularly looked at
“OTHER EVIDENCE OF IMPACT AND CONTRIBUTIONS”
– Evidence of external acknowledgments
• Anything that could help me confirm my impression
on the merit of the candidate: awards, chairing,
editorship, organisation, seminars
– Unknown venues listed just to “fill in” that part must be avoided
• Lack thereof was sending a bad signal: “I am not
involved in the community” or “The community does
not want me”
72. 72/89
Form 101
In 2012, here is what I particularly looked at
“TITLE OF THE PROPOSAL”
– The title, which must be relevant and accurate,
for a quick understanding (or not!) of what the
proposal is all about
73. 73/89
Form 101
In 2012, here is what I particularly looked at
“PROVIDE A MAXIMUM OF 10 KEY WORDS…”
– The keywords, which must be relevant and
accurate, to get a deeper idea of what the
proposal is all about
74. 74/89
Form 101
In 2012, here is what I particularly looked at
“TOTAL AMOUNT REQUESTED FROM NSERC”
– The total amount requested, which could raise
my curiosity if “too” high (say, more than 70 K)
or “too” low (say, less than 30 K)
• I would then read the budget to understand the
detailed numbers
75. 75/89
Form 101
In 2012, here is what I particularly looked at
“SUMMARY OF PROPOSAL FOR PUBLIC RELEASE”
– This very important part
• In my domain, would help me understand the
application and form an idea of what to expect
• Outside my domain, would help me understand the
application as a whole
• I would look for clear objectives, expected results and
their significance, and contributions to HQP
• Any typo or grammar error raised a warning!
76. 76/89
Form 101
In 2012, here is what I particularly looked at
“TOTAL PROPOSED EXPENDITURES”
– The use of the money towards HQP
• “Most” of the money should go to fund HQP; I mean
at least 80%-90%
• But some money should be used for students’ travels
(not only for the candidate’s)
• I would also look for any “surprising” amounts in any
“surprising” category
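The 80%-90% rule of thumb above can be checked mechanically on a draft budget before submission. The sketch below is hypothetical, not an NSERC tool; every category name and amount is invented for illustration:

```python
# Sanity-check the HQP share of a draft budget (hypothetical figures only).
budget = {
    "graduate stipends": 28000,       # CAD per year; all amounts invented
    "undergraduate assistants": 4000,
    "student travel": 3000,
    "PI travel": 2000,
    "small equipment": 3000,
}

# Categories counted as direct support of HQP (students).
hqp_categories = {"graduate stipends", "undergraduate assistants", "student travel"}

total = sum(budget.values())
hqp_total = sum(v for k, v in budget.items() if k in hqp_categories)
hqp_share = hqp_total / total

print(f"HQP share: {hqp_share:.1%}")  # prints "HQP share: 87.5%"
if hqp_share < 0.80:
    print("Warning: less than 80% of the budget goes to HQP")
```

Which categories count as HQP support is a judgment call; the point is simply to flag, before a reviewer does, a budget where too little flows to students.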
77. 77/89
Form 101
In 2012, here is what I particularly looked at
“BUDGET JUSTIFICATION”
– The use of the money towards HQP
• It should include a table showing, per year, the use of the money
on students and possibly pictures of uncommon equipment
• Budget is not so important but errors or lack of justification sent
a bad signal
• Also, it is important to be consistent: 5 K is not enough to
equip a whole lab with computers…
• It should also explain any specific institutional rules or
complementary source of funds
• When naming conferences for travel, only name conferences
relevant to the proposal and possibly justify the choice
78. 78/89
Form 101
In 2012, here is what I particularly looked at
“RELATIONSHIP TO OTHER RESEARCH SUPPORT”
– Possible duplication of funds
• If there is a clear relation between two or more funds,
it should be clear how the money of the
NSERC DG will be used, e.g., on students
79. 79/89
Form 101
In 2012, here is what I particularly looked at
“PROPOSAL”
– The context, objectives, expected results in the
first few lines or maybe an example to help me
understand what the proposal is all about
– A story
– Any missing/misplaced/different category, which
was disruptive and sent a bad signal: “I want to
do differently from others”
80. 80/89
Form 101
In 2012, here is what I particularly looked at
“PROPOSAL”
– An up-to-date background
• A few recent references are interesting
• Some discussion of the references and of their
contributions and limitations is definitely needed
• If possible, some clear, simple, and convincing
running examples illustrating the references’ limits
81. 81/89
Form 101
In 2012, here is what I particularly looked at
“PROPOSAL”
– The contrast with the background
• Then, the proposal should explain how it will improve
on the limits of the state-of-the-art or
state-of-the-practice and demonstrate its feasibility
• Then, it should also explain how it will go beyond
mere feasibility to show innovation
• It must be limited to 3 main objectives, with
milestones and students
82. 82/89
Form 101
In 2012, here is what I particularly looked at
“PROPOSAL”
– Clearly stated goal(s), challenge(s), expected
results and their significance, for each objective
• The proposal should not hide weaknesses and
challenges but explain them and the methodology to
lessen their impacts
• It should also explain how to evaluate the success/
failure of a goal and what happens “if…”
83. 83/89
Form 101
In 2012, here is what I particularly looked at
“PROPOSAL”
– The association of students with the objectives
• Possibly, the proposal should explain the recruiting of
students in the next section, “Contribution to HQP”
– Its possible significance
• On society, not on yourself or your small community
• Use of other, existing funds is “proof” of interest from
external, broader society
84. 84/89
Form 101
In 2012, here is what I particularly looked at
“CONTRIBUTION TO HQP”
– The context of training and its methodology
• It is a chance to explain how this application will
concretely be used to train students
• It is better if there are already some student names
• If the merit of the candidate is low and the candidate
boasts too many students, it can be negative
85. 85/89
Form 101
In 2012, here is what I particularly looked at
“REFERENCES”
– Consistency
• References should be consistent and well-written;
again, any typos or grammar errors raised a warning!
86. 86/89
Outline
Process in a Nutshell
– Candidate’s point of view
– Internal reviewer’s point of view
• Off-line work
• Competition week
Funding decisions
– In a nutshell
– Bins
– ER vs. ECR
Criteria and indicators
– “Values” of criteria
– “Meanings” of indicators
– NSERC rating form
– My own rating form
Advice
– Introduction
– Excellence of the researcher
– Merit of the proposal
– Contribution to the training of HQP
– Form 100
– Form 101
Conclusion
Further readings
87. 87/89
Conclusion
Follow the guidelines / templates carefully
Be simple, clear, straightforward
– Even with the weaknesses
Do not forget anything but do not show off
either, explain, explain, explain
Avoid at all costs typos / grammar errors
88. 88/89
Outline
Process in a Nutshell
– Candidate’s point of view
– Internal reviewer’s point of view
• Off-line work
• Competition week
Funding decisions
– In a nutshell
– Bins
– ER vs. ECR
Criteria and indicators
– “Values” of criteria
– “Meanings” of indicators
– NSERC rating form
– My own rating form
Advice
– Introduction
– Excellence of the researcher
– Merit of the proposal
– Contribution to the training of HQP
– Form 100
– Form 101
Conclusion
Further readings
89. 89/89
Further Readings
NSERC Discovery Grants – An Insider’s View by
Larry W. Kostiuk (Co-Chair, Fluids GSC 1512)
How to Succeed in the New NSERC Discovery
Grant Competition Model by Evangelos Milios, Nur
Zincir-Heywood, and Stavros Konstantinidis
NSERC Discovery Grant Applications: Hints and
Insights by Jason P. Leboe-McGowan
Advice on NSERC Discovery and RTI Applications
by Robert Bridson