{"id":51,"date":"2024-10-03T07:21:13","date_gmt":"2024-10-03T07:21:13","guid":{"rendered":"https:\/\/www.premiasg.org\/web\/premia-best-student-paper-awards-2021\/"},"modified":"2024-10-09T15:36:48","modified_gmt":"2024-10-09T15:36:48","slug":"premia-best-student-paper-awards-2021","status":"publish","type":"page","link":"https:\/\/www.premiasg.org\/web\/premia-best-student-paper-awards-2021\/","title":{"rendered":"Best Paper Awards 2021"},"content":{"rendered":"<div id=\"page\">\n<div id=\"layout\" class=\"pageContentText\">\n<div id=\"layout-content\">\n<div id=\"columns\">\n<div id=\"content\" class=\"container content border-none\">\n<div id=\"content-top\" class=\"top\">\n<div>\n<div><span style=\"font-size: 20px; font-weight: bold;\">PREMIA Best Student Paper Awards 2021<\/span><\/div>\n<\/div>\n<\/div>\n<div id=\"content-side\" class=\"side\">\n<div id=\"content-side2\" class=\"side2\">\n<div class=\"container-content\">\n<div id=\"content-content\">\n<div id=\"content-content-inner\" class=\"container-content-inner\">\n<div id=\"widget-e75fc68e-ad67-37db-1f59-2743cbf6028e\" class=\"widget widget-pagecontent\">\n<div class=\"widget-content\">\n<div id=\"widget-12875cc5-b807-4600-b52b-f8cab04bbfa5\" class=\"widget widget-text\">\n<div class=\"widget-content\">\n<p><strong>Winners of PREMIA Best Student Paper Awards 2021<\/strong><\/p>\n<p><strong>Gold Award<\/strong><\/p>\n<ul>\n<li><em>Practical Federated Gradient Boosting Decision Trees<\/em><br \/>\n<strong>Qinbin Li<\/strong> (AAAI 2020)<\/li>\n<\/ul>\n<p><strong>Silver Awards<\/strong><\/p>\n<ul>\n<li><em>Long-Tailed Classification by Keeping the Good and Removing the Bad Momentum Causal Effect<\/em><br \/>\n<strong>TANG Kaihua<\/strong> (NeurIPS 2020)<\/li>\n<li><em>PANet: Few-Shot Image Semantic Segmentation with Prototype Alignment<\/em><br \/>\n<strong>Kaixin WANG<\/strong> (ICCV 2019)<\/li>\n<\/ul>\n<p><strong>Honourable Mention 
Awards<\/strong><\/p>\n<ul>\n<li><em>Block-wise Recursive Moore-Penrose Inverse for Network Learning<\/em><br \/>\n<strong>Huiping Zhuang<\/strong> (IEEE Transactions on Systems, Man, and Cybernetics)<\/li>\n<li><em>Event-Driven Visual-Tactile Sensing and Learning for Robots<\/em><br \/>\n<strong>Tasbolat Taunyazov<\/strong> (RSS 2020)<\/li>\n<li><em>Hierarchical Reinforcement Learning: A Comprehensive Survey<\/em><br \/>\n<strong>Shubham Pateria<\/strong> (ACM Computing Surveys 2021)<\/li>\n<\/ul>\n<p><strong>Best Presentations<\/strong> ($50)<\/p>\n<ul>\n<li><em>Supervised Autoencoder Joint Learning on Heterogeneous Tactile Sensory Data: Improving Material Classification Performance<\/em><br \/>\n<strong>Ruihan Gao<\/strong> (IROS 2020)<\/li>\n<li><em>LOANT: Latent-Optimized Adversarial Neural Transfer for Sarcasm Detection<\/em><br \/>\n<strong>Xu Guo<\/strong> (NAACL-HLT 2021)<\/li>\n<\/ul>\n<p><span class=\"image-block caption-over-image\" style=\"margin-right: auto; margin-left: auto; display: block; width: 800px;\"><br \/>\n<img fetchpriority=\"high\" decoding=\"async\" id=\"mce-7603\" src=\"..\/..\/..\/attachments\/Image\/Picture1.png?template=generic\" alt=\"\" width=\"800\" height=\"471\" \/><br \/>\n<\/span><\/p>\n<p><strong>Call for Submission\/Nomination<\/strong><\/p>\n<p>The Pattern Recognition and Machine Intelligence Association (PREMIA) invites students to participate in the PREMIA Best Student Paper Awards for papers (journal or conference) accepted or published between <strong>1 July 2019 and 29 March 2021<\/strong>.<\/p>\n<p>Submissions may be made by the student author with the support of their supervisor, or nominated directly by the supervisor.<\/p>\n<p><strong>Submission Deadline:<\/strong> 29 March 2021<\/p>\n<p>Nominated papers will be evaluated by a review panel appointed by the PREMIA board. Shortlisted candidates will be invited to present their papers at PREMIA\u2019s Members\u2019 Night on <strong>12 April 2021<\/strong>; the awards will be decided after the presentations.<\/p>\n<p><strong>Award Categories:<\/strong><\/p>\n<ul>\n<li><strong>Gold Award:<\/strong> Cash prize of S$500<\/li>\n<li><strong>Silver Awards:<\/strong> Cash prize of S$200<\/li>\n<li><strong>Honourable Mention Awards:<\/strong> Cash prize of S$100<\/li>\n<\/ul>\n<p><strong>Eligibility Criteria:<\/strong><\/p>\n<ol>\n<li>The main author of the paper must be a PREMIA member at the time of the event. Applicants may join PREMIA at the time of submission to be eligible for this award. For membership registration or renewal, visit <a href=\"http:\/\/www.premiasg.org\/for-members\/membership\/\" target=\"_new\" rel=\"noopener\">PREMIA Membership<\/a>.<\/li>\n<li>The main author must have been a registered student (full-time or part-time) at a tertiary education institution in Singapore at the time the paper was submitted for publication.<\/li>\n<li>The paper must report work done while the author was a student in Singapore.<\/li>\n<li>The paper must be accepted or published in a journal or conference between <strong>1 July 2019 and 29 March 2021<\/strong>, both dates inclusive.<\/li>\n<li>The paper must address a topic relevant to PREMIA, such as pattern recognition, machine intelligence, computer vision, image processing, speech analysis, or robotics. Relevant methods include statistical techniques, neural networks, deep learning, fuzzy logic, and evolutionary programming.<\/li>\n<li>Shortlisted candidates are required to attend the online Members\u2019 Night event and present their papers.<\/li>\n<li>Only one award will be given per author, even if multiple papers are submitted.<\/li>\n<\/ol>\n<p><strong>Submission Instructions:<\/strong><\/p>\n<ol>\n<li>Complete the PREMIA Best Student Paper Award application entry form (available for download) and attach the following documents:\n<ul>\n<li>A PDF copy of the accepted or published paper.<\/li>\n<li>A PDF copy of the acceptance notification letter\/email from the journal editor or conference organizers.<\/li>\n<li>The supervisor\u2019s supporting or acknowledgment email.<\/li>\n<li>Optional supporting documents, such as reviewer comments or a statement from the supervisor.<\/li>\n<\/ul>\n<\/li>\n<li>Email the submission to PREMIA as a single ZIP file containing all required documents.<\/li>\n<li>For papers co-authored by several students, only one submission will be considered.<\/li>\n<\/ol>\n<p><strong>For inquiries or submissions, contact:<\/strong> Dr. Wang Wei<br \/>\nEmail: <a href=\"mailto:wangwei.cs@gmail.com\" rel=\"noopener\">wangwei.cs@gmail.com<\/a><\/p>\n<\/div>\n<\/div>\n<\/div>\n<\/div>\n<\/div>\n<\/div>\n<\/div>\n<\/div>\n<\/div>\n<div id=\"content-bottom\" class=\"bottom\">\n<div>\n<div><\/div>\n<\/div>\n<\/div>\n<\/div>\n<\/div>\n<\/div>\n<div id=\"layout-footer\"><\/div>\n<\/div>\n<\/div>\n","protected":false},"excerpt":{"rendered":"<p>PREMIA Best 
Student Paper Awards 2021 \u00a0 Winners of PREMIA Best Student Paper Awards 2021 Gold Award Practical Federated Gradient Boosting Decision Trees Qinbin Li (AAAI 2020) Silver Awards Long-Tailed Classification by Keeping the Good and Removing the Bad Momentum&#8230;<br \/><a class=\"read-more-button\" href=\"https:\/\/www.premiasg.org\/web\/premia-best-student-paper-awards-2021\/\">Read more<\/a><\/p>\n","protected":false},"author":2,"featured_media":0,"parent":0,"menu_order":0,"comment_status":"closed","ping_status":"closed","template":"","meta":{"footnotes":""},"class_list":["post-51","page","type-page","status-publish","hentry"],"_links":{"self":[{"href":"https:\/\/www.premiasg.org\/web\/wp-json\/wp\/v2\/pages\/51","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.premiasg.org\/web\/wp-json\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/www.premiasg.org\/web\/wp-json\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/www.premiasg.org\/web\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.premiasg.org\/web\/wp-json\/wp\/v2\/comments?post=51"}],"version-history":[{"count":5,"href":"https:\/\/www.premiasg.org\/web\/wp-json\/wp\/v2\/pages\/51\/revisions"}],"predecessor-version":[{"id":251,"href":"https:\/\/www.premiasg.org\/web\/wp-json\/wp\/v2\/pages\/51\/revisions\/251"}],"wp:attachment":[{"href":"https:\/\/www.premiasg.org\/web\/wp-json\/wp\/v2\/media?parent=51"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}