[1]
|
CECCARELLI F, FERRUCCI L, LONDEI F, et al. Static and dynamic coding in distinct cell types during associative learning in the prefrontal cortex. Nature Communications, 2023, 14(1): 8325. doi: 10.1038/s41467-023-43712-2
|
[2]
|
WATAKABE A, SKIBBE H, NAKAE K, et al. Local and long-distance organization of prefrontal cortex circuits in the marmoset brain. Neuron, 2023, 111(14): 2258-2273.e10.
|
[3]
|
PASSINGHAM R E, LAU H. Do we understand the prefrontal cortex? Brain Structure and Function, 2023, 228(5): 1095−105.
|
[4]
|
TRAPP N T, BRUSS J E, MANZEL K, et al. Large-scale lesion symptom mapping of depression identifies brain regions for risk and resilience. Brain, 2023, 146(4): 1672−85. doi: 10.1093/brain/awac361
|
[5]
|
CHAFEE M V, HEILBRONNER S R. Prefrontal cortex. Current Biology, 2022, 32(8): R346−R51. doi: 10.1016/j.cub.2022.02.071
|
[6]
|
MILLER E K, COHEN J D. An integrative theory of prefrontal cortex function. Annual Review Of Neuroscience, 2001, 24: 167−202. doi: 10.1146/annurev.neuro.24.1.167
|
[7]
|
DIEHL G W, REDISH A D. Differential processing of decision information in subregions of rodent medial prefrontal cortex. eLife, 2023, 12.
|
[8]
|
WANG J X, KURTH-NELSON Z, KUMARAN D, et al. Prefrontal cortex as a meta-reinforcement learning system. Nature Neuroscience, 2018, 21(6): 860−8. doi: 10.1038/s41593-018-0147-8
|
[9]
|
ALEXANDER W H, BROWN J W. Medial prefrontal cortex as an action-outcome predictor. Nature Neuroscience, 2011, 14(10): 1338−44. doi: 10.1038/nn.2921
|
[10]
|
RIKHYE R V, GILRA A, HALASSA M M. Thalamic regulation of switching between cortical representations enables cognitive flexibility. Nature Neuroscience, 2018, 21(12): 1753−63. doi: 10.1038/s41593-018-0269-z
|
[11]
|
GISIGER T, BOUKADOUM M. Mechanisms Gating the Flow of Information in the Cortex: What They Might Look Like and What Their Uses May Be. Frontiers In Computational Neuroscience, 2011, 5: 1.
|
[12]
|
JOHNSTON K, LEVIN H M, KOVAL M J, EVERLING S. Top-down control-signal dynamics in anterior cingulate and prefrontal cortex neurons following task switching. Neuron, 2007, 53(3): 453−62. doi: 10.1016/j.neuron.2006.12.023
|
[13]
|
TSUDA B, TYE K M, SIEGELMANN H T, SEJNOWSKI T J. A modeling framework for adaptive lifelong learning with transfer and savings through gating in the prefrontal cortex. Proceedings Of The National Academy Of Sciences, 2020, 117(47): 29872−82. doi: 10.1073/pnas.2009591117
|
[14]
|
WANG Z, ZHANG J, ZHANG X, et al. Transformer model for functional near-infrared spectroscopy classification. IEEE Journal Of Biomedical And Health Informatics, 2022, 26(6): 2559−69. doi: 10.1109/JBHI.2022.3140531
|
[15]
|
CHOI S R, LEE M. Transformer architecture and attention mechanisms in genome data analysis: a comprehensive review. Biology, 2023, 12(7): 1033. doi: 10.3390/biology12071033
|
[16]
|
LI Q, ZHUANG Y. An efficient image-guided-based 3D point cloud moving object segmentation with transformer-attention in autonomous driving. International Journal Of Applied Earth Observation And Geoinformation, 2023, 123: 103488. doi: 10.1016/j.jag.2023.103488
|
[17]
|
BRUS J, HENG J A, BELIAEVA V, et al. Causal phase-dependent control of non-spatial attention in human prefrontal cortex. Nature Human Behaviour, 2024, 8(4): 743−57. doi: 10.1038/s41562-024-01820-z
|
[18]
|
BICHOT N P, HEARD M T, DEGENNARO E M, DESIMONE R. A Source for Feature-Based Attention in the Prefrontal Cortex. Neuron, 2015, 88(4): 832−44. doi: 10.1016/j.neuron.2015.10.001
|
[19]
|
HUANG L, WANG J, HE Q, et al. A source for category-induced global effects of feature-based attention in human prefrontal cortex. Cell Reports, 2023, 42(9): 113080. doi: 10.1016/j.celrep.2023.113080
|
[20]
|
ZHAO M, XU D, GAO T. From Cognition to Computation: A Comparative Review of Human Attention and Transformer Architectures. arXiv preprint arXiv:2407.01548, 2024.
|
[21]
|
KUMAR S, SUMERS T R, YAMAKOSHI T, et al. Shared functional specialization in transformer-based language models and the human brain. Nature Communications, 2024, 15(1): 5523. doi: 10.1038/s41467-024-49173-5
|
[22]
|
MULLER L, CHURCHLAND P S, SEJNOWSKI T J. Transformers and Cortical Waves: Encoders for Pulling In Context Across Time. arXiv preprint arXiv:2401.14267, 2024.
|
[23]
|
HUANG H, LI R, QIAO X, et al. Attentional control influence habituation through modulation of connectivity patterns within the prefrontal cortex: Insights from stereo-EEG. Neuroimage, 2024, 294: 120640. doi: 10.1016/j.neuroimage.2024.120640
|
[24]
|
LI N, CHEN Y, LI W, et al. BViT: Broad attention-based vision transformer. IEEE Transactions On Neural Networks And Learning Systems, 2023.
|
[25]
|
SHI Q, FAN J, WANG Z, ZHANG Z. Multimodal channel-wise attention transformer inspired by multisensory integration mechanisms of the brain. Pattern Recognition, 2022, 130: 108837. doi: 10.1016/j.patcog.2022.108837
|
[26]
|
GONG D, ZHANG H. Self-Attention Limits Working Memory Capacity of Transformer-Based Models. arXiv preprint arXiv:2409.10715, 2024.
|
[27]
|
MAITH O, SCHWARZ A, HAMKER F H. Optimal attention tuning in a neuro-computational model of the visual cortex-basal ganglia-prefrontal cortex loop. Neural Networks, 2021, 142: 534−47. doi: 10.1016/j.neunet.2021.07.008
|
[28]
|
SPITALE G, BILLER-ANDORNO N, GERMANI F. AI model GPT-3 (dis) informs us better than humans. Science Advances, 2023, 9(26): eadh1850. doi: 10.1126/sciadv.adh1850
|
[29]
|
BROWN T, MANN B, RYDER N, et al. Language models are few-shot learners. Advances In Neural Information Processing Systems, 2020, 33: 1877−901.
|
[30]
|
ZHANG S, ROLLER S, GOYAL N, et al. OPT: Open pre-trained transformer language models. arXiv preprint arXiv:2205.01068, 2022.
|
[31]
|
YUE F, KO T. An Investigation of Positional Encoding in Transformer-based End-to-end Speech Recognition. In: Proceedings of the 12th International Symposium on Chinese Spoken Language Processing (ISCSLP). 2021: 1-5. doi: 10.1109/ISCSLP49672.2021.9362093
|
[32]
|
KAZEMNEJAD A, PADHI I, NATESAN RAMAMURTHY K, et al. The impact of positional encoding on length generalization in transformers. Advances In Neural Information Processing Systems, 2024, 36.
|
[33]
|
CHOWDHERY A, NARANG S, DEVLIN J, et al. PaLM: Scaling language modeling with pathways. Journal Of Machine Learning Research, 2023, 24(240): 1−113.
|
[34]
|
ZHANG R, HAN J, LIU C, et al. LLaMA-Adapter: Efficient fine-tuning of language models with zero-init attention. arXiv preprint arXiv:2303.16199, 2023.
|
[35]
|
TOUVRON H, LAVRIL T, IZACARD G, et al. LLaMA: Open and efficient foundation language models. arXiv preprint arXiv:2302.13971, 2023.
|
[36]
|
WU J, ZHANG R, MAO Y, CHEN J. On scalar embedding of relative positions in attention models. In: Proceedings of the AAAI Conference on Artificial Intelligence, 2021, 35(16): 14050-14057.
|
[37]
|
WALLIS J D, ANDERSON K C, MILLER E K. Single neurons in prefrontal cortex encode abstract rules. Nature, 2001, 411(6840): 953−6. doi: 10.1038/35082081
|
[38]
|
OOTA S R, ARORA J, ROWTULA V, et al. Visio-linguistic brain encoding. arXiv preprint arXiv:2204.08261, 2022.
|
[39]
|
BOCINCOVA A, BUSCHMAN T J, STOKES M G, MANOHAR S G. Neural signature of flexible coding in prefrontal cortex. Proceedings Of The National Academy Of Sciences, 2022, 119(40): e2200400119. doi: 10.1073/pnas.2200400119
|
[40]
|
ZHANG K, HAO W, YU X, SHAO T. An interpretable image classification model combining a fuzzy neural network with a variational autoencoder inspired by the human brain. Information Sciences, 2024, 661: 119885. doi: 10.1016/j.ins.2023.119885
|
[41]
|
AOI M C, MANTE V, PILLOW J W. Prefrontal cortex exhibits multidimensional dynamic encoding during decision-making. Nature Neuroscience, 2020, 23(11): 1410−20. doi: 10.1038/s41593-020-0696-5
|
[42]
|
ZHANG Z, GONG X. Positional label for self-supervised vision transformer. In: Proceedings of the AAAI Conference on Artificial Intelligence, 2023, 37(3): 3516-3524.
|
[43]
|
CAUCHETEUX C, GRAMFORT A, KING J R. Evidence of a predictive coding hierarchy in the human brain listening to speech. Nature Human Behaviour, 2023, 7(3): 430−41. doi: 10.1038/s41562-022-01516-2
|
[44]
|
BUSCH A, ROUSSY M, LUNA R, et al. Neuronal activation sequences in lateral prefrontal cortex encode visuospatial working memory during virtual navigation. Nature Communications, 2024, 15(1): 4471. doi: 10.1038/s41467-024-48664-9
|
[45]
|
LABAIEN J, IDÉ T, CHEN P-Y, et al. Diagnostic spatio-temporal transformer with faithful encoding. Knowledge-Based Systems, 2023, 274.
|
[46]
|
DEIHIM A, ALONSO E, APOSTOLOPOULOU D. STTRE: A Spatio-Temporal Transformer with Relative Embeddings for multivariate time series forecasting. Neural Networks, 2023, 168: 549−59. doi: 10.1016/j.neunet.2023.09.039
|
[47]
|
MA Y, WANG R. Relative-position embedding based spatially and temporally decoupled Transformer for action recognition. Pattern Recognition, 2024, 145.
|
[48]
|
COEN P, SIT T P H, WELLS M J, et al. Mouse frontal cortex mediates additive multisensory decisions. Neuron, 2023, 111(15): 2432-2447.e13.
|
[49]
|
FERRARI A, NOPPENEY U. Attention controls multisensory perception via two distinct mechanisms at different levels of the cortical hierarchy. PLoS Biology, 2021, 19(11): e3001465. doi: 10.1371/journal.pbio.3001465
|
[50]
|
MIHALIK A, NOPPENEY U. Causal inference in audiovisual perception. Journal Of Neuroscience, 2020, 40(34): 6600−12. doi: 10.1523/JNEUROSCI.0051-20.2020
|
[51]
|
KANG K, ROSENKRANZ R, KARAN K, et al. Congruence-based contextual plausibility modulates cortical activity during vibrotactile perception in virtual multisensory environments. Communications Biology, 2022, 5(1): 1360. doi: 10.1038/s42003-022-04318-4
|
[52]
|
CAO Y, SUMMERFIELD C, PARK H, et al. Causal inference in the multisensory brain. Neuron, 2019, 102(5): 1076-1087.e8.
|
[53]
|
GIESSING C, THIEL C M, STEPHAN K E, et al. Visuospatial attention: how to measure effects of infrequent, unattended events in a blocked stimulus design. Neuroimage, 2004, 23(4): 1370−81. doi: 10.1016/j.neuroimage.2004.08.008
|
[54]
|
ZHENG Q, ZHOU L, GU Y. Temporal synchrony effects of optic flow and vestibular inputs on multisensory heading perception. Cell Reports, 2021, 37(7).
|
[55]
|
LIANG P P, ZADEH A, MORENCY L-P. Foundations & Trends in Multimodal Machine Learning: Principles, Challenges, and Open Questions. ACM Computing Surveys, 2024, 56(10): 1−42.
|
[56]
|
KLEMEN J, CHAMBERS C D. Current perspectives and methods in studying neural mechanisms of multisensory interactions. Neuroscience & Biobehavioral Reviews, 2012, 36(1): 111−33.
|
[57]
|
PARASKEVOPOULOS G, GEORGIOU E, POTAMIANOS A. MMLatch: Bottom-Up Top-Down Fusion for Multimodal Sentiment Analysis. In: Proceedings of the 2022 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). 2022: 4573-4577. doi: 10.1109/ICASSP43922.2022.9746418
|
[58]
|
SUN H, LIU J, CHEN Y-W, LIN L. Modality-invariant temporal representation learning for multimodal sentiment classification. Information Fusion, 2023, 91: 504−14. doi: 10.1016/j.inffus.2022.10.031
|
[59]
|
WANG Z, WAN Z, WAN X. TransModality: An end2end fusion method with transformer for multimodal sentiment analysis. In: Proceedings of The Web Conference 2020. 2020: 2514-2520.
|
[60]
|
YU J, CHEN K, XIA R. Hierarchical interactive multimodal transformer for aspect-based multimodal sentiment analysis. IEEE Transactions On Affective Computing, 2022, 14(3): 1966−78.
|
[61]
|
HUANG J, ZHOU J, TANG Z, et al. TMBL: Transformer-based multimodal binding learning model for multimodal sentiment analysis. Knowledge-Based Systems, 2024, 285.
|
[62]
|
YANG D, LIU Y, HUANG C, et al. Target and source modality co-reinforcement for emotion understanding from asynchronous multimodal sequences. Knowledge-Based Systems, 2023, 265.
|
[63]
|
AHN H J, LEE D H, JEONG J H, LEE S W. Multiscale Convolutional Transformer for EEG Classification of Mental Imagery in Different Modalities. IEEE Transactions On Neural Systems And Rehabilitation Engineering, 2022, 31: 646-656.
|
[64]
|
LI J, CHEN N, ZHU H, et al. Incongruity-aware multimodal physiology signals fusion for emotion recognition. Information Fusion, 2024, 105.
|
[65]
|
ASIF M, GUPTA A, ADITYA A, et al. Brain Multi-Region Information Fusion using Attentional Transformer for EEG Based Affective Computing. In: Proceedings of the 20th India Council International Conference (INDICON). 2023: 771-775. doi: 10.1109/INDICON59947.2023.10440791
|
[66]
|
CHEN Z, HAN Y, MA Z, et al. A prefrontal-thalamic circuit encodes social information for social recognition. Nature Communications, 2024, 15(1): 1036. doi: 10.1038/s41467-024-45376-y
|
[67]
|
YU J, LI J, YU Z, HUANG Q. Multimodal Transformer With Multi-View Visual Representation for Image Captioning. IEEE Transactions On Circuits And Systems For Video Technology, 2020, 30(12): 4467−80. doi: 10.1109/TCSVT.2019.2947482
|
[68]
|
HU B, GUAN Z-H, CHEN G, CHEN C P. Neuroscience and network dynamics toward brain-inspired intelligence. IEEE Transactions on Cybernetics, 2021, 52(10): 10214−27.
|
[69]
|
SUCHOLUTSKY I, MUTTENTHALER L, WELLER A, et al. Getting aligned on representational alignment. arXiv preprint arXiv:2310.13018, 2023.
|
[70]
|
CHERSONI E, SANTUS E, HUANG C-R, LENCI A. Decoding word embeddings with brain-based semantic features. Computational Linguistics, 2021, 47(3): 663−98. doi: 10.1162/coli_a_00412
|
[71]
|
TONEVA M, WEHBE L. Interpreting and improving natural-language processing (in machines) with natural language-processing (in the brain). Advances in Neural Information Processing Systems, 2019, 32.
|
[72]
|
YU S, GU C, HUANG K, LI P. Predicting the next sentence (not word) in large language models: What model-brain alignment tells us about discourse comprehension. Science Advances, 2024, 10(21): eadn7744. doi: 10.1126/sciadv.adn7744
|
[73]
|
CAMBRIA E, DAS D, BANDYOPADHYAY S, FERACO A. Affective computing and sentiment analysis. A Practical Guide To Sentiment Analysis, 2017: 1-10.
|
[74]
|
MISHRA A, DEY K, BHATTACHARYYA P. Learning cognitive features from gaze data for sentiment and sarcasm classification using convolutional neural network. In: Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers). 2017.
|
[75]
|
ZHANG Z, WU C, CHEN H, CHEN H. CogAware: Cognition-Aware framework for sentiment analysis with textual representations. Knowledge-Based Systems, 2024, 299.
|
[76]
|
MONTEJO-RÁEZ A, MOLINA-GONZÁLEZ M D, JIMÉNEZ-ZAFRA S M, et al. A survey on detecting mental disorders with natural language processing: Literature review, trends and challenges. Computer Science Review, 2024, 53: 100654. doi: 10.1016/j.cosrev.2024.100654
|
[77]
|
RAMACHANDRAN G, YANG R. CortexCompile: Harnessing Cortical-Inspired Architectures for Enhanced Multi-Agent NLP Code Synthesis. arXiv preprint arXiv:2409.02938, 2024.
|
[78]
|
LI Z, ZHAO B, ZHANG G, DANG J. Brain network features differentiate intentions from different emotional expressions of the same text. In: Proceedings of the 2023 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). IEEE, 2023.
|
[79]
|
SQUIRES M, TAO X, ELANGOVAN S, et al. Deep learning and machine learning in psychiatry: a survey of current progress in depression detection, diagnosis and treatment. Brain Informatics, 2023, 10(1): 10. doi: 10.1186/s40708-023-00188-6
|
[80]
|
SONG G, HUANG D, XIAO Z. A study of multilingual toxic text detection approaches under imbalanced sample distribution. Information, 2021, 12(5): 205. doi: 10.3390/info12050205
|
[81]
|
DOSOVITSKIY A, BEYER L, KOLESNIKOV A, et al. An image is worth 16x16 words: Transformers for image recognition at scale. arXiv preprint arXiv:2010.11929, 2020.
|
[82]
|
VASWANI A, SHAZEER N, PARMAR N, et al. Attention is all you need. Advances In Neural Information Processing Systems, 2017, 30.
|
[83]
|
BI Y, ABROL A, JIA S, et al. Gray Matters: An Efficient Vision Transformer GAN Framework for Predicting Functional Network Connectivity Biomarkers from Brain Structure. bioRxiv, 2024: 2024.01.11.575307.
|
[84]
|
DONG S, GONG Y, SHI J, et al. Brain Cognition-Inspired Dual-Pathway CNN Architecture for Image Classification. IEEE Transactions On Neural Networks And Learning Systems, 2024, 35(7): 9900−14. doi: 10.1109/TNNLS.2023.3237962
|
[85]
|
LIU L, WANG F, ZHOU K, et al. Perceptual integration rapidly activates dorsal visual pathway to guide local processing in early visual areas. PLoS Biology, 2017, 15(11): e2003646. doi: 10.1371/journal.pbio.2003646
|
[86]
|
BAR M. The proactive brain: using analogies and associations to generate predictions. Trends In Cognitive Sciences, 2007, 11(7): 280−9. doi: 10.1016/j.tics.2007.05.005
|
[87]
|
BARAM A B, MULLER T H, NILI H, et al. Entorhinal and ventromedial prefrontal cortices abstract and generalize the structure of reinforcement learning problems. Neuron, 2021, 109(4): 713-723.e7.
|
[88]
|
VAN HOLSTEIN M, FLORESCO S B. Dissociable roles for the ventral and dorsal medial prefrontal cortex in cue-guided risk/reward decision making. Neuropsychopharmacology, 2020, 45(4): 683−93. doi: 10.1038/s41386-019-0557-7
|
[89]
|
AVERBECK B, O'DOHERTY J P. Reinforcement-learning in fronto-striatal circuits. Neuropsychopharmacology, 2022, 47(1): 147−62. doi: 10.1038/s41386-021-01108-0
|
[90]
|
HU S, SHEN L, ZHANG Y, et al. On Transforming Reinforcement Learning With Transformers: The Development Trajectory. IEEE Transactions On Pattern Analysis And Machine Intelligence, 2024.
|
[91]
|
ZHANG Y, JIA M, CHEN T, et al. A neuroergonomics model for evaluating nuclear power plants operators’ performance under heat stress driven by ECG time-frequency spectrums and fNIRS prefrontal cortex network: A CNN-GAT fusion model. Advanced Engineering Informatics, 2024, 62: 102563. doi: 10.1016/j.aei.2024.102563
|
[92]
|
LAW C-K, KOLLING N, CHAN C C, CHAU B K. Frontopolar cortex represents complex features and decision value during choice between environments. Cell Reports, 2023, 42(6).
|
[93]
|
LEE J, JUNG M, LUSTIG N, LEE J H. Neural representations of the perception of handwritten digits and visual objects from a convolutional neural network compared to humans. Human Brain Mapping, 2023, 44(5): 2018−38. doi: 10.1002/hbm.26189
|
[94]
|
VISWANATHAN K A, MYLAVARAPU G, CHEN K, THOMAS J P. A Study of Prefrontal Cortex Task Switching Using Spiking Neural Networks. In: Proceedings of the 12th International Conference on Advanced Computational Intelligence (ICACI). IEEE, 2020.
|
[95]
|
LI B-Z, PUN S H, FENG W, et al. A spiking neural network model mimicking the olfactory cortex for handwritten digit recognition. In: Proceedings of the 9th International IEEE/EMBS Conference on Neural Engineering (NER). IEEE, 2019.
|
[96]
|
HYAFIL A, SUMMERFIELD C, KOECHLIN E. Two mechanisms for task switching in the prefrontal cortex. Journal Of Neuroscience, 2009, 29(16): 5135−42. doi: 10.1523/JNEUROSCI.2828-08.2009
|
[97]
|
KUSHLEYEVA Y, SALVUCCI D D, LEE F J. Deciding when to switch tasks in time-critical multitasking. Cognitive Systems Research, 2005, 6(1): 41−9. doi: 10.1016/j.cogsys.2004.09.005
|
[98]
|
BRASS M, VON CRAMON D Y. The role of the frontal cortex in task preparation. Cerebral Cortex, 2002, 12(9): 908−14. doi: 10.1093/cercor/12.9.908
|
[99]
|
WEI Q, HAN L, ZHANG T. Learning and Controlling Multiscale Dynamics in Spiking Neural Networks Using Recursive Least Square Modifications. IEEE Transactions On Cybernetics, 2024.
|
[100]
|
DEMIR A, KOIKE-AKINO T, WANG Y, et al. EEG-GNN: Graph neural networks for classification of electroencephalogram (EEG) signals. In: Proceedings of the 43rd Annual International Conference of the IEEE Engineering in Medicine & Biology Society (EMBC). IEEE, 2021.
|
[101]
|
BALAJI S S, PARHI K K. Classifying Subjects with PFC Lesions from Healthy Controls during Working Memory Encoding via Graph Convolutional Networks. In: Proceedings of the 11th International IEEE/EMBS Conference on Neural Engineering (NER). 2023: 1-4. doi: 10.1109/NER52421.2023.10123793
|
[102]
|
YANG Y, YE C, MA T. A deep connectome learning network using graph convolution for connectome-disease association study. Neural Networks, 2023, 164: 91−104. doi: 10.1016/j.neunet.2023.04.025
|
[103]
|
ACHTERBERG J, AKARCA D, STROUSE D J, et al. Spatially embedded recurrent neural networks reveal widespread links between structural and functional neuroscience findings. Nature Machine Intelligence, 2023, 5(12): 1369−81. doi: 10.1038/s42256-023-00748-9
|
[104]
|
JENSEN K T, HENNEQUIN G, MATTAR M G. A recurrent network model of planning explains hippocampal replay and human behavior. Nature Neuroscience, 2024: 1-9.
|
[105]
|
PRATIWI M. Comparative Analysis of Brain Waves for EEG-Based Depression Detection in the Prefrontal Cortex Lobe using LSTM. In: Proceedings of the 7th International Conference on New Media Studies (CONMEDIA). IEEE, 2023.
|
[106]
|
PRATIWI M. EEG-Based Depression Detection in the Prefrontal Cortex Lobe using mRMR Feature Selection and Bidirectional LSTM. Ultima Computing: Jurnal Sistem Komputer, 2023, 15(2): 71−8. doi: 10.31937/sk.v15i2.3426
|
[107]
|
SHARMA S, SHARMA S, ATHAIYA A. Activation functions in neural networks. Towards Data Science, 2017, 6(12): 310−6.
|
[108]
|
JAGTAP A D, KAWAGUCHI K, KARNIADAKIS G E. Adaptive activation functions accelerate convergence in deep and physics-informed neural networks. Journal Of Computational Physics, 2020, 404: 109136. doi: 10.1016/j.jcp.2019.109136
|
[109]
|
ABBASI J, ANDERSEN P Ø. Physical activation functions (PAFs): an approach for more efficient induction of physics into physics-informed neural networks (PINNs). Neurocomputing, 2024, 608: 128352. doi: 10.1016/j.neucom.2024.128352
|
[110]
|
JAGTAP A D, KARNIADAKIS G E. How important are activation functions in regression and classification? A survey, performance comparison, and future directions. Journal Of Machine Learning For Modeling And Computing, 2023, 4(1).
|
[111]
|
MANOLA L, ROELOFSEN B, HOLSHEIMER J, et al. Modelling motor cortex stimulation for chronic pain control: electrical potential field, activating functions and responses of simple nerve fibre models. Medical And Biological Engineering And Computing, 2005, 43: 335−43. doi: 10.1007/BF02345810
|
[112]
|
STEINERBERGER S, WU H-T. Fundamental component enhancement via adaptive nonlinear activation functions. Applied And Computational Harmonic Analysis, 2023, 63: 135−43. doi: 10.1016/j.acha.2022.11.007
|
[113]
|
PAPPAS C, KOVAIOS S, MORALIS-PEGIOS M, et al. Programmable tanh-, elu-, sigmoid-, and sin-based nonlinear activation functions for neuromorphic photonics. IEEE Journal Of Selected Topics In Quantum Electronics, 2023, 29(6): 1-10.
|
[114]
|
HA D, SCHMIDHUBER J. World models. arXiv preprint arXiv:1803.10122, 2018.
|
[115]
|
ESLAMI S A, JIMENEZ REZENDE D, BESSE F, et al. Neural scene representation and rendering. Science, 2018, 360(6394): 1204−10. doi: 10.1126/science.aar6170
|
[116]
|
YAMINS D L, HONG H, CADIEU C F, et al. Performance-optimized hierarchical models predict neural responses in higher visual cortex. Proceedings Of The National Academy Of Sciences, 2014, 111(23): 8619−24. doi: 10.1073/pnas.1403112111
|
[117]
|
FRISTON K, MORAN R J, NAGAI Y, et al. World model learning and inference. Neural Networks, 2021, 144: 573−90. doi: 10.1016/j.neunet.2021.09.011
|
[118]
|
ROBINE J, HÖFTMANN M, UELWER T, HARMELING S. Transformer-based world models are happy with 100k interactions. arXiv preprint arXiv:2303.07109, 2023.
|
[119]
|
MICHELI V, ALONSO E, FLEURET F. Transformers are sample-efficient world models. arXiv preprint arXiv:2209.00588, 2022.
|
[120]
|
CHEN C, WU Y-F, YOON J, AHN S. TransDreamer: Reinforcement learning with transformer world models. arXiv preprint arXiv:2202.09481, 2022.
|
[121]
|
ZHANG W, WANG G, SUN J, et al. STORM: Efficient stochastic transformer based world models for reinforcement learning. Advances In Neural Information Processing Systems, 2024, 36.
|
[122]
|
HAFNER D, PASUKONIS J, BA J, LILLICRAP T. Mastering diverse domains through world models. arXiv preprint arXiv:2301.04104, 2023.
|
[123]
|
HAFNER D, LILLICRAP T, NOROUZI M, BA J. Mastering Atari with discrete world models. arXiv preprint arXiv:2010.02193, 2020.
|
[124]
|
BARTO A G, SUTTON R S, ANDERSON C W. Looking back on the actor–critic architecture. IEEE Transactions On Systems, Man, And Cybernetics: Systems, 2020, 51(1): 40−50.
|
[125]
|
KAISER L, BABAEIZADEH M, MILOS P, et al. Model-based reinforcement learning for Atari. arXiv preprint arXiv:1903.00374, 2019.
|
[126]
|
MOERLAND T M, BROEKENS J, PLAAT A, JONKER C M. Model-based reinforcement learning: A survey. Foundations And Trends® In Machine Learning, 2023, 16(1): 1−118.
|
[127]
|
GU A, GOEL K, RÉ C. Efficiently modeling long sequences with structured state spaces. arXiv preprint arXiv:2111.00396, 2021.
|
[128]
|
SMITH J T, WARRINGTON A, LINDERMAN S W. Simplified state space layers for sequence modeling. arXiv preprint arXiv:2208.04933, 2022.
|
[129]
|
DENG F, PARK J, AHN S. Facing off world model backbones: RNNs, Transformers, and S4. Advances In Neural Information Processing Systems, 2024, 36.
|
[130]
|
CHEN J, LI S E, TOMIZUKA M. Interpretable end-to-end urban autonomous driving with latent deep reinforcement learning. IEEE Transactions On Intelligent Transportation Systems, 2021, 23(6): 5068−78.
|
[131]
|
HAFNER D, LILLICRAP T, BA J, NOROUZI M. Dream to control: Learning behaviors by latent imagination. arXiv preprint arXiv:1912.01603, 2019.
|
[132]
|
ZHANG Y, MU Y, YANG Y, et al. Steadily learn to drive with virtual memory. arXiv preprint arXiv:2102.08072, 2021.
|
[133]
|
GAO Z, MU Y, CHEN C, et al. Enhance sample efficiency and robustness of end-to-end urban autonomous driving via semantic masked world model. IEEE Transactions On Intelligent Transportation Systems, 2024.
|
[134]
|
YU N, LV Z, YAN J, WANG Z. Spatial Cognition and Decision Model Based on Hippocampus-Prefrontal Cortex Interaction. In: Proceedings of the 2023 China Automation Congress (CAC). 2023: 3754-3759. doi: 10.1109/CAC59555.2023.10450650
|