I work with Windows 7 (my system: "LC_COLLATE=French_France.1252") and with data containing accents.
My data are encoded in ANSI, which lets me view them correctly in the RStudio tabs.
My problem: when I create a googleVis page (encoded in UTF-8), the accented characters are not displayed correctly.
What I expected: I want to convert my latin1 data frames to UTF-8 in R just before creating the googleVis pages. I have no idea how. The stringi package seems to work only with raw data.
fr <- data.frame(âge = c(15, 20), prénom = c("Adélia", "Adão"), row.names = c("I1", "I2"))
print(fr)
library(googleVis)
test <- gvisTable(fr)
plot(test)
The real data:
https://drive.google.com/open?id=0B91cr4hfMXV4OEkzWk1aWlhvR0E

# importing (historical data)
test_ansi <- read.table("databig_ansi.csv",
                        header = TRUE, sep = ",",
                        na.strings = "",
                        quote = "\"",
                        dec = ".")
# subsetting
library(dplyr)
test_ansi <-
  test_ansi %>%
  count(ownera)
# checking the encoding of the character column
library(stringi)
stri_enc_detect(test_ansi$ownera)
# visualisation
library(googleVis)
testvis <- gvisTable(test_ansi)
plot(testvis)
There are built-in functions in several packages, such as stringi, stringr, SoundexBR, and tau, as well as a character conversion function in base R, which can be used as:
text2 <- iconv(text, from = "latin1", to = "UTF-8")
You may also want a more specific function with some checks for factors, like the one sketched below.
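A minimal sketch of such a helper, assuming the goal is to convert every character or factor column (and the column names) of a data frame from latin1 to UTF-8; the name df_to_utf8 is illustrative, not from any package:
# convert all character and factor columns of a data frame to UTF-8
df_to_utf8 <- function(df, from = "latin1") {
  for (col in names(df)) {
    if (is.factor(df[[col]])) {
      # convert the factor levels and keep the column as a factor
      levels(df[[col]]) <- iconv(levels(df[[col]]), from = from, to = "UTF-8")
    } else if (is.character(df[[col]])) {
      df[[col]] <- iconv(df[[col]], from = from, to = "UTF-8")
    }
  }
  # the column names may carry accents too (e.g. "âge", "prénom")
  names(df) <- iconv(names(df), from = from, to = "UTF-8")
  df
}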
Then you just use it as:
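(continuing the illustrative df_to_utf8 helper above, and assuming fr was created in a latin1 locale)
fr_utf8 <- df_to_utf8(fr)
test <- gvisTable(fr_utf8)
plot(test)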
Extra edit after the additional information and data:
This is how my R is set up: it runs in a UTF-8 locale with English by default. Since my system encoding differs from the encoding of the file provided, I use
fileEncoding = "LATIN1" when reading it (a sketch of that call follows below). That is all. Link to the table created
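A minimal sketch of that import, reusing the databig_ansi.csv call from above with the extra fileEncoding argument:
test_utf8 <- read.table("databig_ansi.csv",
                        header = TRUE, sep = ",",
                        na.strings = "",
                        quote = "\"",
                        dec = ".",
                        fileEncoding = "LATIN1")
library(googleVis)
plot(gvisTable(test_utf8))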