Project 1

Using Python and ML Techniques to Analyze GPT-2 (2021)

GPT-2, which stands for Generative Pre-trained Transformer 2, is a large-scale unsupervised language model from OpenAI that generates paragraphs of text and performs reading comprehension, machine translation, question answering, and summarization, all without task-specific training.
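
To give a sense of the raw material, here is a minimal sketch of sampling text from GPT-2. It assumes the Hugging Face transformers library and the public "gpt2" checkpoint; the prompt is an arbitrary example, not one from the project.

from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

# Example prompt (an illustrative assumption, not a prompt used in the project).
prompt = "Artificial intelligence will"
inputs = tokenizer(prompt, return_tensors="pt")

# Sample a continuation; sampling (rather than greedy decoding) produces
# the varied text that downstream analysis operates on.
output_ids = model.generate(
    **inputs,
    max_length=60,
    do_sample=True,
    top_k=50,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))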

I thought it would be interesting to analyse an already-existing machine learning tool, so I applied machine learning techniques to text generated by GPT-2, looking for patterns in how the model behaves and exposing any biases it may have.
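
The sketch below illustrates the kind of pattern analysis described above, under assumed tooling: completions are embedded with TF-IDF and grouped with k-means (scikit-learn) so that recurring themes, and potentially biased associations, surface as clusters. The sample sentences are hypothetical placeholders; in practice they would be completions collected from GPT-2, as in the snippet above.

from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer

# Placeholder completions standing in for real GPT-2 output.
samples = [
    "The doctor said he would review the results tomorrow.",
    "The nurse said she would check on the patient.",
    "The engineer explained his design to the board.",
    "The teacher told her class to open their books.",
]

# Embed each completion as a TF-IDF vector and cluster the vectors.
vectors = TfidfVectorizer(stop_words="english").fit_transform(samples)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(vectors)

# Recurring groupings (e.g. occupations co-occurring with gendered
# pronouns) are one way a bias in the generated text might show up.
for text, label in zip(samples, labels):
    print(label, text)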

The Turing Test examines whether a machine can exhibit behaviour indistinguishable from that of a human. It involves an interviewer, a human participant, and a machine. The interviewer asks questions for five minutes and must distinguish between the human and the machine; if the interviewer cannot, the machine is considered to be demonstrating human-like intelligence. The result does not depend on the machine's ability to give correct answers, but on how closely its responses resemble those a human would give. As of now, no machine has passed the test.

Questions like “Can AI possess human mindedness?” and “Can AI take over the world?” have loomed over our minds ever since this technology emerged. Being able to analyse an open-source AI and present the resulting information in an interface like the one below lets anyone gain insight into how it functions and whether it has the potential to pass the test.

Process

Coming soon.



Reflection

Coming soon.