TY - JOUR
TI - CODE GENERATION USING TRANSFORMER BASED LANGUAGE MODEL
AB - Machine Learning has attracted researchers over the last decades and has been applied to problems in many fields. Deep Learning, a subfield of Machine Learning, has been utilized to solve complex and hard problems as computer technologies have improved. Natural language processing is one of the challenging tasks that still needs improvement for applications such as code generation. Recently, general-purpose transformer-based autoregressive language models have achieved promising results on natural language generation tasks. Code generation from natural utterances using deep learning methods could be a promising development in terms of decreasing mental effort and time spent. In this study, a layered approach to generating Cascading Style Sheets rules is proposed. Abstract data is obtained from natural utterances using a large-scale language model. The information is then encoded into an Abstract Syntax Tree, which is finally decoded to generate the Cascading Style Sheets rules. An experimental procedure is constructed to measure the performance of the proposed method. Using pre-trained transformers and generated training data for Cascading Style Sheets rules, different tests are applied to different datasets and the accuracies are obtained. Promising results are achieved for Cascading Style Sheets code generation using structural and natural prompt designs, with overall accuracies of 46.98% and 66.07%, respectively.
AU - Alaçam, Umut Can
AU - Gökgöz, Çağla
AU - Perkgoz, Cahit
PY - 2022
JO - Journal of scientific reports-A (Online)
VL - 0
IS - 049
SN - 2687-6167
SP - 49
EP - 61
DB - TRDizin
UR - http://search/yayin/detay/534144
ER -