Haskell executable linking with static library written in C++ got `undefined reference`


I've created a static library:

// foo.h
extern "C" {
int foo (const char* arg0, int arg1);
}

// foo.cpp
#include "foo.h"
// implementation of foo

This code was compiled to foo.o and packaged into libfoo.a, which was installed to MinGW's lib dir (I'm on Windows, using the GCC toolchain).

What I want to do is wrap this function in Haskell code, so a typical FFI binding as follows:

-- Foo.hs
{-# LANGUAGE ForeignFunctionInterface #-}
module Foo where

import Foreign.C.String (CString)
import Foreign.C.Types (CInt)

foreign import ccall "foo"
    c_foo :: CString -> CInt -> IO CInt
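For context, this is roughly how c_foo would be called from Haskell. The c_foo below is a pure stand-in with hypothetical behaviour, so the snippet runs without linking against libfoo.a; in the real program it would be the foreign import above:

```haskell
module Main where

import Foreign.C.String (CString, withCString)
import Foreign.C.Types (CInt)

-- Pure stand-in for the real binding, so this sketch runs without libfoo.a.
-- In the real program this is the `foreign import ccall "foo"` declaration.
c_foo :: CString -> CInt -> IO CInt
c_foo _ n = return (n + 1)  -- hypothetical behaviour, for illustration only

-- Marshal a Haskell String to a NUL-terminated CString for the call's duration.
callFoo :: String -> Int -> IO Int
callFoo s n = withCString s $ \cs -> fromIntegral <$> c_foo cs (fromIntegral n)

main :: IO ()
main = callFoo "hello" 41 >>= print  -- prints 42
```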

An extra-libraries field was added to the .cabal file as well:

...
extra-libraries: foo, stdc++
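If the archive isn't on the linker's default search path, extra-lib-dirs needs to point at its directory too; a sketch (the path below is hypothetical, adjust to the actual MinGW install):

```
extra-libraries: foo, stdc++
extra-lib-dirs:  C:\MinGW\lib
```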

But GHC complains about an undefined reference to foo:

.dist-scion\build\path\to\Foo.o:fake:(.text+0x514): undefined reference to `foo'

After running nm on the library to confirm that foo actually exists in it (with some decoration on the name), I'm stuck here...


[EDIT]
I also tried building the Haskell package using cabal:

cabal configure
cabal build

The result shows:

Loading object (dynamic) foo ... ghc.exe: foo: ....
<command line>: user specified .o/.so/.DLL could not be loaded (addDLL: could not load DLL)

So does this have something to do with static vs. dynamic linking? 'Cause I noticed GHC wants to load .o/.so/.DLL files but NOT .a. I'm really confused.


Finally got something on the wiki: Cxx Foreign Function Interface


[EDIT]
One solution is to use -optl-lfoo -optl-lstdc++ in the .cabal file instead of extra-libraries. And the naming problem can easily be solved by wrapping the declaration in extern "C":

#ifdef __cplusplus
extern "C" {
#endif

extern int foo (const char*, int);

#ifdef __cplusplus
}
#endif
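In the .cabal file, those linker flags go under ghc-options rather than extra-libraries; a sketch:

```
ghc-options: -optl-lfoo -optl-lstdc++
```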

This works within EclipseFP, because it uses Scion. But it still fails on cabal build.
