Re: blessing db data as utf8
On Wed, Jun 09, 2004 at 11:28:59PM +0300, Gaal Yahas wrote:
> On Wed, Jun 09, 2004 at 01:23:49PM +0200, Andreas Fromm wrote:
> > > The problem is that the database doesn't know better: as far as it is
> > > concerned, the data is (say) latin1. This is true at least for mysql 4.0.20
> > > which I have been using. This means that the metadata about encoding type
> > > of a table or a column can't come from the database, even though ideally
> > > it should. I don't know DBIx::ContextualFetch well enough to say, but DBI
> > > seems at the moment to be too low-level for this kind of knowledge.
> > >
> > > That said, *my* data is all utf8, so I don't mind a global switch :)
> > >
> > What about PostgreSQL, where you tell the server at database creation how
> > to encode the data? When I create a db with unicode encoding, it
> > _should_ know about the encoding, shouldn't it?
>
> I just wanted to post an update on this issue. Following a tip from
> Dominic Mitchell <dom@xxxxxxxxx.xxx>, who did this for Pg, I have
> patched DBD::mysql to support utf8 data.
>
> http://lists.mysql.com/perl/3006
>
> Hopefully, this will hit CPAN soon.
>
> Thanks everybody for the comments, and especially Dominic from whom I
> borrowed some code!
This isn't a good way to check for utf8:
+int is_high_bit_set(char *val) {
+ while (*val++)
+ if (*val & 0x80) return 1;
+ return 0;
+}
because it makes it hard for any latin-1 data to coexist.
The perl guts probably have a function to check for well-formed utf8,
and that should be used instead.
Tim.
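For illustration, here is a minimal sketch of the kind of check Tim is
suggesting, using public perlapi routines rather than a high-bit scan. The
helper name mark_sv_utf8_if_valid is made up for this example and is not
taken from the DBD::mysql or DBD::Pg patches; a driver would call something
like it on each fetched string value.

    #include "EXTERN.h"
    #include "perl.h"
    #include "XSUB.h"

    /* Turn on the UTF8 flag on a fetched value only if its bytes are
     * well-formed UTF-8; anything else (latin-1, binary) is left alone. */
    static int
    mark_sv_utf8_if_valid(pTHX_ SV *sv)
    {
        STRLEN len;
        const U8 *bytes;

        if (!SvPOK(sv))
            return 0;                      /* not a plain string value */

        bytes = (const U8 *)SvPV(sv, len);

        if (is_utf8_string(bytes, len)) {  /* perlapi well-formedness check */
            SvUTF8_on(sv);                 /* just flag it; no re-encoding */
            return 1;
        }
        return 0;
    }

perl also provides sv_utf8_decode(), which bundles the validity scan with
setting the flag, so a driver could call that directly instead of an
explicit is_utf8_string() check.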