The char type stores a single character. You wrap the character in single quotes:
char grade = 'A';
char initial = 'J';
char symbol = '@';
Notice the single quotes. Double quotes create a String, not a char. Writing char letter = "A"; causes a compile error. This is a common early mistake.
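A minimal sketch of the distinction (the class and variable names here are just for illustration):

```java
public class CharVsString {
    public static void main(String[] args) {
        // char letter = "A";    // compile error: String cannot be converted to char
        char letter = 'A';       // correct: single quotes for a char
        String word = "A";       // double quotes create a one-character String
        System.out.println(letter);
        System.out.println(word);
    }
}
```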
Under the hood, char stores a 16-bit Unicode number. The letter 'A' is stored as the number 65. This means you can do arithmetic with characters: 'A' + 1 gives 66, which is the code for 'B'. You won't use char as often as int or String, but it shows up in string manipulation and character-by-character processing.
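A short sketch of that arithmetic (class name CharMath is hypothetical). Note that char values are promoted to int in arithmetic, so the result of 'A' + 1 must be cast back to char to get a letter:

```java
public class CharMath {
    public static void main(String[] args) {
        char a = 'A';
        System.out.println((int) a);        // 65, the Unicode number for 'A'
        int next = a + 1;                   // char promotes to int in arithmetic
        System.out.println(next);           // 66
        System.out.println((char) next);    // B, cast back to char

        // A common trick: convert a digit character to its numeric value
        char digit = '7';
        System.out.println(digit - '0');    // 7
    }
}
```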