TY - GEN
T1 - Attention Layer on Hybrid Transformer Based Model for Legal Entity Recognition in Court Decision Documents
AU - Dina, Aghnia Bella
AU - Purwitasari, Diana
AU - Sholikah, Rizka
N1 - Publisher Copyright:
© 2025 IEEE.
PY - 2025
Y1 - 2025
N2 - Court decision documents contain complex and structured legal information, including critical entities such as the names of litigating parties, dates, and legal references. However, the formal and rigid language used in these documents often makes it difficult for the general public to understand their content and for legal practitioners to efficiently identify key elements. With the implementation of Legal Entity Recognition (LER), judges, legal professionals, and the public can more easily recognize and extract important legal entities from court decisions, thereby accelerating legal analysis, improving access to legal information, and enabling more effective decision-making. To address this challenge, a LER model is introduced that integrates a base transformer with an Attention Layer, Bidirectional Long Short-Term Memory (BiLSTM), and Conditional Random Field (CRF) to produce more accurate contextual representations of legal entities. The Attention Layer plays a critical role in improving the model's ability to focus on relevant keywords and context-dependent cues, allowing it to identify entities that are often difficult to detect with standard approaches. Evaluation results show that our proposed model achieves an F1 score of 85%, outperforming baseline models such as BiLSTM, BiLSTM-CRF, and basic Transformer architectures. These findings demonstrate the effectiveness of the model in understanding legal language and its potential to support automated entity extraction within the legal domain.
AB - Court decision documents contain complex and structured legal information, including critical entities such as the names of litigating parties, dates, and legal references. However, the formal and rigid language used in these documents often makes it difficult for the general public to understand their content and for legal practitioners to efficiently identify key elements. With the implementation of Legal Entity Recognition (LER), judges, legal professionals, and the public can more easily recognize and extract important legal entities from court decisions, thereby accelerating legal analysis, improving access to legal information, and enabling more effective decision-making. To address this challenge, a LER model is introduced that integrates a base transformer with an Attention Layer, Bidirectional Long Short-Term Memory (BiLSTM), and Conditional Random Field (CRF) to produce more accurate contextual representations of legal entities. The Attention Layer plays a critical role in improving the model's ability to focus on relevant keywords and context-dependent cues, allowing it to identify entities that are often difficult to detect with standard approaches. Evaluation results show that our proposed model achieves an F1 score of 85%, outperforming baseline models such as BiLSTM, BiLSTM-CRF, and basic Transformer architectures. These findings demonstrate the effectiveness of the model in understanding legal language and its potential to support automated entity extraction within the legal domain.
KW - Attention Layer
KW - BiLSTM
KW - CRF
KW - Legal Entity Recognition
KW - Transformer
UR - https://www.scopus.com/pages/publications/105025402064
U2 - 10.1109/AIMS66189.2025.11229449
DO - 10.1109/AIMS66189.2025.11229449
M3 - Conference contribution
AN - SCOPUS:105025402064
T3 - 2025 IEEE International Conference on Artificial Intelligence and Mechatronics Systems, AIMS 2025
BT - 2025 IEEE International Conference on Artificial Intelligence and Mechatronics Systems, AIMS 2025
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 3rd IEEE International Conference on Artificial Intelligence and Mechatronics Systems, AIMS 2025
Y2 - 24 May 2025 through 25 May 2025
ER -