Fall 2017 CS40 Foundations of Computer Science

Revision as of 11:14, 17 August 2017 by Wangwilliamyang

Instructor and Venue

  • Instructor: William Wang
  • TAs: TBD
  • Grader: TBD
  • Time: M W 2:00pm - 3:15pm
  • Location: CHEM 1171
  • Discussions: M 4:00pm - 4:50pm GIRV 1112, M 5:00pm - 5:50pm GIRV 1115, M 6:00pm - 6:50pm PHELP 3519
  • TA Office Hours: TBD
  • Instructor Office Hours: M 4:30-5:30 HFH 1115
  • Prerequisites: Computer Science 16 with a grade of C or better and Mathematics 4A with a grade of C or better.

Course Objective

  • Course goals
  • To learn and be able to apply basic knowledge of:
    • Logic - propositional logic, predicate logic, rules of inference
    • Proofs - proof methods and strategies
    • Sets and functions - sets, set operations and functions, function types
    • Algorithms and complexity
    • Relations - representing relations and their properties; orderings
    • Recursion and induction
    • Counting - counting methods, permutations and combinations
    • Number theory - divisibility and modular arithmetic
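To give a concrete taste of the logic and number theory topics listed above, here is a short illustrative sketch (not course material): it verifies a propositional equivalence, De Morgan's law, by exhaustively checking all truth assignments, and checks a basic modular arithmetic fact.

```python
from itertools import product

def equivalent(f, g, n_vars=2):
    """Return True if formulas f and g agree on every truth assignment."""
    return all(f(*vals) == g(*vals)
               for vals in product([True, False], repeat=n_vars))

# De Morgan's law: not (p and q)  is equivalent to  (not p) or (not q)
lhs = lambda p, q: not (p and q)
rhs = lambda p, q: (not p) or (not q)
print(equivalent(lhs, rhs))  # True

# Modular arithmetic: (a + b) mod m == ((a mod m) + (b mod m)) mod m
a, b, m = 17, 23, 5
print((a + b) % m == ((a % m) + (b % m)) % m)  # True
```

Exhaustive truth-table checking like this works for small formulas; the course's proof techniques show how to establish such facts for arbitrarily many variables.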

As an instructor, my goals are to help you to become more of a self-directed learner, i.e. to learn how to learn on your own. Self-directed learning, like any skill, takes practice.

  • What you will learn

By the end of the quarter, you should understand the basic concepts of logic, proofs, sets, functions, algorithms, proof techniques, counting, and relations, and have some insight into how these relate to more advanced topics in computer science. You will gain experience, both conceptually and practically, through homework assignments that involve solving problems and implementing the concepts and techniques you have learned.

Piazza

Please enroll if you haven't: piazza.com/ucsb/winter2017/cs190i

Syllabus

  • 01/10 Introduction & logistics, and NLP applications
  • 01/12 Basic text processing
  • 01/17 N-grams & language models & HW1 out
  • 01/19 Text classification: naive Bayes
  • 01/24 Voted perceptron and logistic regression
  • 01/26 Part-of-speech tagging: HMMs
  • 01/31 HMMs and MEMMs & HW1 due / HW2 out
  • 02/02 Conditional Random Fields
  • 02/07 In-class midterm exam
  • 02/09 Natural language parsing
  • 02/14 Word sense disambiguation & HW2 due (extended to 02/15 midnight) / HW3 out
  • 02/16 Guest lecture on Social Media: Vivek Kulkarni (Stony Brook)
  • 02/21 Distributional semantics (1): sparse representation
  • 02/23 Distributional semantics (2): dense representation
  • 02/28 Machine translation & HW3 due / HW4 out
  • 03/02 Question answering
  • 03/07 Deep learning for NLP: RNNs
  • 03/09 Deep learning for NLP: CNNs
  • 03/14 Course review and class evaluation & HW4 due
  • 03/16 In-class final exam

Course Description

Have you ever used intelligent virtual assistants such as Google Now, Apple Siri, Amazon Alexa, or Microsoft Cortana? Have you wondered what technologies lie behind such systems? How did IBM's Watson beat top human Jeopardy players? Or are you just curious about how Google Translate works? Understanding human language is an important goal for Artificial Intelligence, and this course introduces fundamental theories and practical applications in Natural Language Processing (NLP). In particular, this course focuses on the design of basic machine learning algorithms (e.g., classification and structured prediction) for core NLP problems. The concentration of this course is on the mathematical, statistical, and computational foundations of NLP.

Throughout the course, we will cover classic lexical, syntactic, and semantic processing topics in NLP, including language modeling, sentiment analysis, part-of-speech tagging, parsing, word sense disambiguation, distributional semantics, question answering, information extraction, and machine translation. The parallel theme on machine learning algorithms for NLP will focus on classic supervised, semi-supervised, and unsupervised learning models, including naive Bayes, logistic regression, hidden Markov models, maximum entropy Markov models, conditional random fields, feed-forward neural networks, recurrent neural networks, and convolutional neural networks. Throughout the course, we will study the implicit assumptions made in each of these machine learning models and understand the pros and cons of these modern statistical tools for solving NLP problems. A key emphasis of this course is on empirical and statistical analysis of large text corpora, and on distilling useful structured knowledge from large collections of unstructured documents.
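As a preview of the naive Bayes classifier mentioned above, here is a minimal illustrative sketch with made-up toy data (not course code): it estimates class and word probabilities from labeled texts and classifies a new document using add-one (Laplace) smoothing.

```python
import math
from collections import Counter, defaultdict

# Toy labeled corpus (hypothetical examples for illustration only).
docs = [("good great fun", "pos"), ("boring bad plot", "neg"),
        ("great acting", "pos"), ("bad boring", "neg")]

class_counts = Counter(label for _, label in docs)
word_counts = defaultdict(Counter)          # label -> word frequencies
for text, label in docs:
    word_counts[label].update(text.split())
vocab = {w for counter in word_counts.values() for w in counter}

def predict(text):
    """Return the label maximizing log P(label) + sum of log P(word | label)."""
    best, best_lp = None, float("-inf")
    for label in class_counts:
        lp = math.log(class_counts[label] / len(docs))      # class prior
        total = sum(word_counts[label].values())
        for w in text.split():
            # Add-one smoothing avoids zero probability for unseen words.
            lp += math.log((word_counts[label][w] + 1) / (total + len(vocab)))
        if lp > best_lp:
            best, best_lp = label, lp
    return best

print(predict("great fun"))  # pos
```

The model's "naive" conditional-independence assumption, that words are independent given the class, is exactly the kind of implicit modeling assumption the course examines.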

Text Book

No official textbook is required for this class, but the following optional textbook is recommended:

  • Speech and Language Processing (2nd ed.), Dan Jurafsky and James H. Martin.

A free draft version of a new edition of this book is available online [1].

Grading

There will be four homework assignments (40%), one midterm exam (20%), and a final exam (40%). Four late days are allowed with no penalty. After that, a 50% penalty applies to submissions up to four days past the due date, unless you have a note from a doctor's office. Homework submissions that are five days late will receive zero credit. Your grades can be found on GauchoSpace.
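The course score is a weighted average under the breakdown above; here is a small worked example with hypothetical scores (the weights come from the grading policy, the scores are made up).

```python
# Weights follow the grading breakdown above; scores are hypothetical.
weights = {"homework": 0.40, "midterm": 0.20, "final": 0.40}
scores  = {"homework": 85.0, "midterm": 90.0, "final": 80.0}  # out of 100

course_score = sum(weights[k] * scores[k] for k in weights)
print(course_score)  # 84.0
```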

Academic Integrity

We follow UCSB's academic integrity policy from UCSB Campus Regulations, Chapter VII ("Student Conduct and Discipline"):

  • It is expected that students attending the University of California understand and subscribe to the ideal of academic integrity, and are willing to bear individual responsibility for their work. Any work (written or otherwise) submitted to fulfill an academic requirement must represent a student’s original work. Any act of academic dishonesty, such as cheating or plagiarism, will subject a person to University disciplinary action. Using or attempting to use materials, information, study aids, or commercial “research” services not authorized by the instructor of the course constitutes cheating. Representing the words, ideas, or concepts of another person without appropriate attribution is plagiarism. Whenever another person’s written work is utilized, whether it be a single phrase or longer, quotation marks must be used and sources cited. Paraphrasing another’s work, i.e., borrowing the ideas or concepts and putting them into one’s “own” words, must also be acknowledged. Although a person’s state of mind and intention will be considered in determining the University response to an act of academic dishonesty, this in no way lessens the responsibility of the student.

More specifically, we follow Stefano Tessaro and William Cohen's policy in this class:

You may not copy code or answers to homework questions or exams from your classmates or from other sources. You may discuss course materials and assignments with your classmates, but you may not write anything down during such discussions; you must write your answers and code independently. Any help or collaboration, whether given or received, must be explicitly stated and disclosed in full by everyone involved on the first page of their assignment. Specifically, each assignment solution must start by answering the following questions:

  • (1) Did you receive any help whatsoever from anyone in solving this assignment? Yes / No.
    • If you answered 'yes', give full details (e.g., "Jane explained to me what is asked in Question 3.4.")
  • (2) Did you give any help whatsoever to anyone in solving this assignment? Yes / No.
    • If you answered 'yes', give full details (e.g., "I pointed Joe to Section 2.3 to help him with Question 2.")
  • No electronics are allowed during exams, but you may prepare an A4-sized note sheet and bring it to the exam.
  • If you have questions, ask the teaching staff.

Academic dishonesty will be reported through the highest line of command at UCSB. Students who engage in such activities will automatically receive an F grade.

Accessibility

Students with documented disabilities are asked to contact the DSP office to arrange the necessary academic accommodations.

Discussions

All discussions and questions should be posted on our course Piazza site.