TwoDArray/Tokenizer help

My project is basically to read a long string of characters and grab a token for each word. After I do that, I put each word in an array and find its frequency. What I need help with is how to put each word into a TWO-D array. I am able to cout every word separately, but I'm not sure how to put the words into an array continuously.
#include <iostream>
#include <string>
#include <vector>

int main()
{
    std::string token ; // 1D array of characters
    // http://www.mochima.com/tutorials/strings.html

    std::vector<std::string> array ; // 1D array of strings; 1D array of 1D array of characters == 2D array
    // http://www.mochima.com/tutorials/vectors.html

    while( std::cin >> token ) array.push_back(token) ; // put each token into the TWO-D array.

    // TODO: calculate frequencies and print
    // hint: std::sort // http://en.cppreference.com/w/cpp/algorithm/sort
}
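For the TODO, one possible approach (just a sketch, not the only way): std::sort the vector so that equal words end up next to each other, then walk through it and count each run of equal words. Something along these lines:

#include <iostream>
#include <string>
#include <vector>
#include <algorithm> // std::sort

int main()
{
    std::vector<std::string> array ;
    std::string token ;
    while( std::cin >> token ) array.push_back(token) ; // read every word

    std::sort( array.begin(), array.end() ) ; // equal words are now adjacent

    std::size_t i = 0 ;
    while( i < array.size() )
    {
        std::size_t count = 1 ; // length of the run of words equal to array[i]
        while( i + count < array.size() && array[i+count] == array[i] ) ++count ;

        std::cout << array[i] << " : " << count << '\n' ;
        i += count ; // skip past this run to the next distinct word
    }
}

Sorting first keeps the counting loop simple; a std::map<std::string, std::size_t> would also work, but it may not have been introduced in class yet either.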
Is it possible to do this project with only cstring/iostream? My teacher has not introduced vectors.
#include <iostream>
#include <cstring>
#include <iomanip>

int main()
{
    const std::size_t MAX_TOKEN_SIZE = 51 ; // 50 characters + 1 null character at the end
    const std::size_t MAX_TOKENS = 1000 ;

    char all_tokens[MAX_TOKENS][MAX_TOKEN_SIZE] = {{}} ; // TWO-D array
    std::size_t num_tokens = 0 ; // number of tokens actually read

    // read tokens (at most MAX_TOKEN_SIZE-1 characters each) one by one and place them into all_tokens until
    while( num_tokens <  MAX_TOKENS && // either MAX_TOKENS tokens have been read
           std::cin >> std::setw(MAX_TOKEN_SIZE) >> all_tokens[num_tokens] ) // or input fails
               ++num_tokens ;

    // std::cout << num_tokens << " tokens were read\n" ;
    // for( std::size_t i = 0 ; i < num_tokens ; ++i ) std::cout << all_tokens[i] << '\n' ;

    // TODO: calculate frequencies and print
    // hint: std::sort // http://en.cppreference.com/w/cpp/algorithm/sort
}
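If the frequencies also have to be computed with just <cstring> and <iostream>, one simple (if quadratic) sketch: for every token, skip it if an equal token already appeared earlier in the array, otherwise scan the rest of the array with std::strcmp and count the matches. For example, continuing from the program above:

#include <iostream>
#include <cstring>
#include <iomanip>

int main()
{
    const std::size_t MAX_TOKEN_SIZE = 51 ;
    const std::size_t MAX_TOKENS = 1000 ;

    char all_tokens[MAX_TOKENS][MAX_TOKEN_SIZE] = {{}} ;
    std::size_t num_tokens = 0 ;

    while( num_tokens < MAX_TOKENS &&
           std::cin >> std::setw(MAX_TOKEN_SIZE) >> all_tokens[num_tokens] )
               ++num_tokens ;

    for( std::size_t i = 0 ; i < num_tokens ; ++i )
    {
        // if this token already appeared earlier, it has been counted and printed; skip it
        bool seen_before = false ;
        for( std::size_t j = 0 ; j < i ; ++j )
            if( std::strcmp( all_tokens[i], all_tokens[j] ) == 0 ) { seen_before = true ; break ; }
        if( seen_before ) continue ;

        // count how many times this token appears from position i onwards
        std::size_t count = 1 ;
        for( std::size_t j = i+1 ; j < num_tokens ; ++j )
            if( std::strcmp( all_tokens[i], all_tokens[j] ) == 0 ) ++count ;

        std::cout << all_tokens[i] << " : " << count << '\n' ;
    }
}

The double loop is O(n*n), which is fine for a thousand tokens; sorting the rows first would be faster but needs more code without the standard algorithms.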