------------------------------------------------------------
-= 1.2 =-

For the release notes: 
Cabal 1.1.1 cannot read Cabal 1.0's .setup-config files. Users will
need to re-run the configure command for all their projects after
upgrading from Cabal 1.0. Otherwise, they will get:
C:\software\cabal>runghc Setup.lhs build
Setup.lhs: error reading ./.setup-config; run "setup configure" command?

It is not clear how to build an executable whose Main module is
preprocessed. The following does not work:
Executable: Foo
Main-is: Main.hsc

Cabal does not handle dependencies for HSC2HS correctly. For example,
if Foo.hsc has
     #include "x.h"
then Foo.hs should get regenerated whenever x.h is modified. However,
Cabal only regenerates Foo.hs when Foo.hsc has been modified. Please
let me know if you want a complete testcase.
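A fix would have to scan the .hsc source for local #include lines and compare those headers' modification times against the generated .hs file. A minimal sketch of the scanning half (the function names here are illustrative, not current Cabal API):

```haskell
-- Sketch: collect the local #include dependencies of a .hsc file.
-- A real fix would compare the modification times of these headers
-- against the generated .hs file before deciding to skip regeneration.
import Data.Char (isSpace)
import Data.List (stripPrefix)
import Data.Maybe (mapMaybe)

-- Extract "x.h" from a line like:  #include "x.h"
-- System headers in angle brackets (#include <x.h>) are ignored.
localInclude :: String -> Maybe FilePath
localInclude line =
  case stripPrefix "#include" (dropWhile isSpace line) of
    Just rest -> case dropWhile isSpace rest of
                   '"' : more -> Just (takeWhile (/= '"') more)
                   _          -> Nothing
    Nothing   -> Nothing

hscIncludes :: String -> [FilePath]
hscIncludes = mapMaybe localInclude . lines
```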

- Brian

* Clarify build-depends; unix? posix? utils? see journal entry of aug
   22, 2005.

* look at "cabal design 

* REMOVE THESE FLAGS: a patch adds a couple of extra configure options,
  --enable-library-for-ghci / --disable-library-for-ghci (or whatever
   they end up being); document them in the manual.  Which is the
   default?  Add a test case for this.

* Something for gentoo which produces a package gen file but doesn't
  do the register??

* Mine Brian Smith emails for tests & patches

* Move Distribution and everything
  into a subdirectory, so we can build the Setup file with the normal
  invocation of cabal?

* add a cabal-version field?

* ignore unknown fields? (--force?)

-= 1.0 =-
* new field  data-files, a list of files to be copied to a place where
   an executable can find them (e.g. template-hsc.h for hsc2hs):
        Hugs: the directory containing the Main module
        GHC/Windows: the directory containing the executable
        GHC/Unix: /usr/local/share/<exename>
   plus a new function in System.Directory to return the name of this
   directory.  That would address Dimitry's requirements in

   > How about allowing directories too, which would be copied recursively?

Also, if we do this, we should probably specify the manner in which
such a directory should be laid out, so that:

1) it doesn't get too cluttered,
2) different packages don't stomp on each other's files, and
3) different versions of different packages can use the same filenames.

dataFileDir :: Distribution.Package.PackageIdentifier -> FilePath
dataFileDir ident = dataFileDirRoot `joinFilePath` (showPackageId ident)

which I like better, but that strongly couples dataFileDir to the
Cabal package, in that you need to have a PackageIdentifier.  How do
you get that PackageIdentifier?  Well, your program will have to parse
your .cabal file.  No problem! (if you have one)

* New field extra-tmp-files, a list of extra files to be removed by
   setup clean, beyond those that can be deduced.

* Rename other-files as extra-source-files for consistency and clarity.

* Install libraries in $libdir/ghc-$ghc_version/ rather than $libdir.

* do something with stub files generated by ghc?

* Decide on interface
** Which fields are required, which targets required.
** which Distribution.* things won't change?
** document

* Fix up sdist? hide sdist? bdist?
** if there's a flag, --include-preprocessed-sources (or something
   better) run the preprocessing phase and include both the
   unpreprocessed and the preprocessed sources in the source tarball?
But really, there are two kinds of preprocessors, as Ross points out:
the kind that produces OS-independent code, and the kind that produces
OS-dependent code.  Perhaps this concept should be added to the
PreProcessor type, and we could have two flags to sdist:

--include-standalone-preprocessed-sources

Which would generate the OS-independent sources from tools like Alex
and Happy... 

--include-all-preprocessed-sources

Which just includes all of the preprocessed sources as above.

A downside to this is in how it interacts with another proposal to add
tool dependencies.  If a package tool-depends on "alex", and then a
source tarball is created with
--include-standalone-preprocessed-sources, then it actually no longer
tool-depends on alex, so we should regenerate the .cabal file.  I
guess that's no big deal.
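The extra field on PreProcessor might look like this (a sketch only; the real PreProcessor type in Cabal's preprocessing module is shaped differently):

```haskell
-- Sketch: tag each preprocessor with whether its output is
-- platform-independent, so sdist can decide what to ship.
data PreProcessor = PreProcessor
  { platformIndependent :: Bool       -- True for alex/happy, False for hsc2hs
  , runPreProcessor     :: FilePath   -- input file
                        -> FilePath   -- output file
                        -> IO ()
  }

-- sdist --include-standalone-preprocessed-sources would ship only
-- the output of preprocessors whose result is platform-independent.
includeInStandaloneSdist :: PreProcessor -> Bool
includeInStandaloneSdist = platformIndependent
```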

** Better way to find 'tar'; is there a library? what does darcs do?

* do we have to run configure before clean?

* Preprocessors
** chain of preprocessors
** what other preprocessors can't unlit?

* Hugs - look for "FIX (HUGS)"

* Haddock
  - should process hidden modules as well as exposed ones.  The hidden
    modules might contain entities that are re-exported by an exposed
    module.  Hidden modules should use the #hide haddock directive.

  - if GHC is present and hscpp is not, we can use 'ghc -E -cpp'.  This
    also unlits.

  - haddock should be passed the names of the interface files for the
    dependent packages (gotten from haddock_interfaces field of the
    dependent packages, query ghc-pkg).

  - we should install the haddock interface, and fill in the location
    in haddock_interfaces.  Similarly for the HTML, and haddock_html.

* grep for "FIX"

* Parsing
** Allow quoting in the options fields, to allow things like
  -f"something with spaces"
** Instead of freaking out on unknown fields, the parser should return
   a list of those unknown fields so a warning can be printed. Or not.
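The quoting rule above could be sketched as a small tokenizer (not the actual Cabal parser; it splits on spaces but keeps double-quoted spans, so -f"something with spaces" stays one token):

```haskell
-- Sketch: split an options field on whitespace, honouring double quotes.
-- Quotes may appear mid-token, as in -f"something with spaces".
splitOptions :: String -> [String]
splitOptions = go
  where
    go s = case dropWhile (== ' ') s of
             ""   -> []
             rest -> let (tok, rest') = token rest in tok : go rest'
    token ""         = ("", "")
    token (' ' : cs) = ("", cs)                      -- token ends at a space
    token ('"' : cs) = let (quoted, rest) = break (== '"') cs
                           (tok, rest')   = token (drop 1 rest)
                       in (quoted ++ tok, rest')     -- splice quoted span in
    token (c : cs)   = let (tok, rest) = token cs in (c : tok, rest)
```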

* Doc
** do comments have to start in the first column?
** clarify relationship between other-modules and modules, etc.
** add preprocessor explanation (see bottom of this TODO).
** Fix example for angela, expose Data.Set, etc, not A, B, etc.
** add information about executable stanzas
** eliminate need for cpphs in haddock makefile rule.
** add info about deb packages to the web page; at least check out the
   manpage for dh_haskell, section "How to package a haskell library"

* Misc
** HC-PKG (see "Depends on HC-PKG" below)
** add more layered tools to appendix?
** make reference to "layered tools" appendix where appropriate
** integrate hscpp, use it for preprocessing step.
** SDist for windows machines, or machines without tar.
** add sanity checking command?

* testing
** find a real test case that uses preprocessors
** add a make target or command for tests we know will fail?
** setup test suite to run on --push?
** redirect non-hunit outputs to a file?
** test / port code for Hugs
** error cases for parsing command-line args
** reading & writing configuration-dropping
** use-cases based on SimonPJ's doc
** discovering the location of the given flavor of compiler and pkg tool

------------------------------------------------------------
-= Future Releases =-

* Depends on HC-PKG
** hugs-pkg
** register for hugs
** configure: check for presence of build dependencies

* NHC Support
** look carefully at "rawSystem" and error handling stuff for nhc.
** add install target for nhc
** add information for compiling w/ nhc
** nhc-pkg (see old package manager code)
** register

* Hugs
- no way to tell Hugs to turn packages on or off
- no register / unregister for hugs

* Misc
** ./Setup.lhs bdist
** Reorganize compiler dependent code into Distribution.Compiler.*
** API Versioning? Libtool-style or just a major number?
** Extensions
- complain if their use makes the code non-portable?
-- but what does this mean? ghc & hugs?

** "collections / distributions, etc" multiple cabal packages in one package
** It would be useful to have alternatives in dependencies, e.g. HGL
   could depend on X11 | Win32.
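One way to model that (a sketch; the type and field names here are made up, not Cabal's Dependency type) is to treat each build-depends entry as a group of alternatives, satisfied when any one resolves:

```haskell
-- Sketch: a dependency is satisfied if any one of its alternatives is.
-- "X11 | Win32" would parse to Alternatives ["X11", "Win32"].
newtype Alternatives = Alternatives [String]   -- hypothetical, not Cabal's type
  deriving (Show, Eq)

satisfied :: [String]       -- packages known to be installed
          -> Alternatives
          -> Bool
satisfied installed (Alternatives alts) = any (`elem` installed) alts
```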

** sanity checking tool for configuration; are all the .hs files
   included, etc.

** create a (native?) zlib library?

** sign flag?

** for fields like allModules, allow user to specify "Foo.Bar.*" or
   something to indicate all haskell modules under that?
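Expanding such a pattern against a known module list might look like this (a sketch; a real version would scan the source directories rather than take the list as an argument):

```haskell
import Data.List (isPrefixOf)

-- Sketch: expand "Foo.Bar.*" against the modules we can see.
-- A plain name matches only itself; "Prefix.*" matches every module
-- underneath that prefix.
expandModulePattern :: [String]   -- all modules found under hs-source-dir
                    -> String     -- pattern from the .cabal file
                    -> [String]
expandModulePattern known pat
  | Just prefix <- stripStar pat = filter ((prefix ++ ".") `isPrefixOf`) known
  | otherwise                    = filter (== pat) known
  where
    -- "Foo.Bar.*" -> Just "Foo.Bar"; anything else -> Nothing
    stripStar p = case reverse p of
                    '*' : '.' : rest -> Just (reverse rest)
                    _                -> Nothing
```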

** Get function from hmake that creates a directory based on arch.

** ./Setup test
- this may be something that's easy to break off and give to someone
   else.
- give to John Goerzen?

** writePersistBuildConfig robustify + diagnostics
** elaborate command-line help text
** configure should check for 'ar' args + properties (see fptools/aclocal.m4)
** most commands should accept a -v flag to show command lines?
** configure should check version of compiler

** hat support
** per-system source database
** rebuild for new compiler
** helium
** hbc


------------------------------------------------------------

* Orthogonal (layered?) tools

** visual studio support

** hackage

** downloadable public database of packages (wget filename;tar xf
   filename;cd filename;./setup install)

   NOTE: such an interface might be implemented w/ xml-rpc, which is
   there for Haskell now, though in general we'll probably want to be
   careful here about dependencies.

** debian package building (boilerplate) tool.  Other debian support
   w/ rebuild-all-packages?

------------------------------------------------------------
[1] Foo.y is a happy grammar which, when processed, will produce Foo.hs.

The description file should include the module Foo.

./setup sdist (source distribution): Include Foo.y, not Foo.hs.  Maybe
we could add a flag to include Foo.hs as well.  This makes sense for
some preprocessors and not for others, but I'm wary of including too
much preprocessor-specific behavior.

./setup clean: Removes Foo.hs if Foo.y exists.

./setup build: Preprocesses Foo.y to create Foo.hs before any
compilation.

The issue with cpp is that we can't go by extensions as we do with the
rest of the preprocessors.  There is a function in HMake which tests
whether a file needs to be cpp'd, so we can employ that.  I think
we'll probably have to treat cpp a little differently from the
others, unfortunately, and I haven't gotten around to it.
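Lacking a distinguishing extension, the test has to look inside the file. A crude sketch of such a check (a guess at the heuristic, not HMake's actual code):

```haskell
import Data.Char (isSpace)
import Data.List (isPrefixOf)

-- Sketch: a file "needs cpp" if any line, after leading whitespace,
-- starts with a cpp directive.  Crude: a '#' inside a string or comment
-- would fool it, which is why a real test must be more careful.
needsCpp :: String -> Bool
needsCpp = any directive . lines
  where
    directive l = any (`isPrefixOf` dropWhile isSpace l)
                      ["#if", "#ifdef", "#ifndef", "#include", "#define"]
```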
