NRIC 2026 accepts applied AI projects in healthcare, biomedical sciences, and medical education. Coding is not required for every track, but each submission must define a specific medical problem, a substantive AI pathway, and declared-track execution evidence. Submission alone does not guarantee presentation at the conference.
Each individual may submit one AI project through the conference profile portal. The submitter is treated as the presenting author.
The presenting author must be an undergraduate MBBS student. Technical co-authors are allowed, but all team members must be declared at submission time.
All submissions undergo double-blind review and are screened for AI-generated scientific content and plagiarism. The required summary sections, declared-track evidence, and integrity disclosures are mandatory.
Coding is not required, but the submission must provide a fully specified workflow with clear inputs, outputs, AI logic, and a credible validation and deployment plan.
Narrative ideas without workflow specificity do not meet the Track A standard.
For no-code, low-code, or UI/UX builds. Submit functional evidence: clickable prototype, no-code build, or demo video up to 1 minute.
Static wireframes and screenshots without functional context do not meet the Track B standard.
For programming or ML teams. Submit a functional model or tested pipeline with documented performance metrics and a demo video up to 1 minute.
Untested or undocumented code does not meet the Track C standard.
AI projects enter the same Framework governance pathway, so routing, scoring, and committee adjudication follow a consistent scientific standard across the conference cycle.
Declared-track integrity, eligibility, required sections, and governance disclosures are validated before scoring begins.
Reviewers assess clinical value, AI methodology, execution evidence, and policy alignment using the scientific rubric.
The committee reconciles reviewer outputs, resolves scoring variance, and confirms the final poster selections communicated to the programme.
Define a specific unmet clinical or healthcare workflow problem.
State who uses it, where, and under what constraints.
Explain why AI is appropriate compared with simpler tools.
Describe framework/architecture with technical specificity.
Specify data origin and precise system inputs/outputs.
Provide track-aligned evidence: blueprint, prototype, or metrics.
Tracks A/B require robust proposed validation; Track C requires empirical results.
Address workflow integration, scalability, and infrastructure.
Provide specific technical, clinical, and practical limitations.
Cover privacy, consent, bias, misuse risk, and human oversight; Track C requires an approved ethical statement where applicable.
Disclose originality and any generative AI tool usage accurately.
Peer review score out of 30 across six criteria (1-5 each), judged against your declared track standard.
Execution bonus out of 10: Track A (+0), Track B (+5), and Track C (+10), granted only when execution evidence is valid for the declared track.
If declared track and submitted evidence mismatch, the Scientific Committee can reclassify before scoring (for example, Track C with no functional model may be moved to Track A and lose execution bonus).
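The scoring arithmetic above (a 30-point rubric plus a track-dependent execution bonus) can be sketched as a short example. This is an illustration only, assuming the bonus is simply added to the rubric total for a maximum of 40; the function and variable names are hypothetical, not part of the official rubric:

```python
# Illustrative sketch of the scoring arithmetic described above.
# Assumption: the execution bonus is added to the six-criterion rubric
# total, giving a maximum of 40. Names here are hypothetical.

EXECUTION_BONUS = {"A": 0, "B": 5, "C": 10}

def total_score(criterion_scores, track, evidence_valid):
    """Sum six 1-5 criterion scores and add the declared-track bonus.

    The bonus is granted only when execution evidence is valid for
    the declared track; a reclassified project takes its new track's
    bonus (e.g. Track C moved to Track A receives +0).
    """
    if len(criterion_scores) != 6 or not all(1 <= s <= 5 for s in criterion_scores):
        raise ValueError("expected six criterion scores, each 1-5")
    rubric = sum(criterion_scores)  # out of 30
    bonus = EXECUTION_BONUS[track] if evidence_valid else 0
    return rubric + bonus  # out of 40

# Example: a Track C project scoring 4 on every criterion with a
# validated functional model: 24 + 10 = 34.
print(total_score([4, 4, 4, 4, 4, 4], "C", True))  # 34
```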
Summary body limit: 500 words.
One optional figure is allowed and does not count toward word limit.
Submit via NRIC Conference Profile Portal only; email submissions are not accepted.
Abbreviations are allowed in text only when defined at first mention; do not use abbreviations in title.
Accepted file format: Microsoft Word (.doc or .docx) only.
Title must reflect AI methodology and clinical application; no abbreviations in title.
The presenting author must be an undergraduate MBBS student; technical co-authors are permitted.
The presenting author does not need to be listed first, but their name must appear in bold and underlined in the submission document.
All team members must be declared at submission; post-submission author list changes are not accepted.
Include all author names and institutional affiliations in the submitted document.
Use Times New Roman, size 12-14, with 1.5 line spacing.
Filename format: AIProject-[Track]-[FirstAuthorLastname]-[Shorttitle]-[PresentingAuthorLastname].
Presenting author must complete separate conference registration regardless of project acceptance status.
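The filename convention above can be checked mechanically before uploading. A minimal sketch, assuming track codes A/B/C and alphanumeric name parts; the exact pattern details are an assumption for illustration, not an official validator:

```python
import re

# Hypothetical validator for the stated filename convention:
# AIProject-[Track]-[FirstAuthorLastname]-[Shorttitle]-[PresentingAuthorLastname]
# Assumptions: track is A, B, or C; each part is alphanumeric with no hyphens.
FILENAME_RE = re.compile(
    r"^AIProject-(?P<track>[ABC])-(?P<first>[A-Za-z]+)-"
    r"(?P<title>[A-Za-z0-9]+)-(?P<presenting>[A-Za-z]+)$"
)

def is_valid_filename(stem):
    """Return True if the filename stem (without extension) matches."""
    return FILENAME_RE.match(stem) is not None

print(is_valid_filename("AIProject-B-Khan-TriageApp-Ali"))   # True
print(is_valid_filename("AIProject-D-Khan-TriageApp-Ali"))   # False: no Track D
```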
Language editing, literature discovery, and code scaffolding are allowed.
Intellectual content cannot be AI-generated in place of team reasoning. False or incomplete disclosures are treated as academic misconduct.
Detection of substantially AI-generated scientific content.
Plagiarism of text, methodology, or conceptual framing.
Near-duplicate submissions with substantially similar concepts across teams.
Missing mandatory submission components.
Falsified authorship, affiliation, or AI-use declarations.
Submission through unauthorised channels.
Inability to explain or justify the submission during live evaluation.
Deliberate misrepresentation of declared track versus execution evidence.
All projects are judged through NRIC’s Framework for Abstract Triage, Evaluation and Harmonisation (FATEH). Selected projects are assigned poster presentation by the Scientific Committee, and the selected presenting authors receive poster printing fee instructions separately.
Late submissions are not accepted under any circumstances.
Approval from all co-authors is required; the presenting author is responsible for resolving any authorship conflicts.