This page is prepared for Hadoop Core committers. You need committer rights to create a new Hadoop Core release.
DEPRECATED! This doc refers to releases that are already end of life. For current releases, please see HowToRelease.
These instructions have been updated for Hadoop 0.20.x and 1.x releases. For earlier releases, check out an older revision of this page. For 0.21.0 and later many of the steps need to be done in turn for Common, HDFS, and MapReduce. For 0.23.x and 2.x releases, there is a new version of this page at HowToReleasePostMavenization.
If you have not already done so, update your @apache.org account via id.apache.org with your key; also add and commit your public key to the KEYS file in the Hadoop repository, appending the output of the following commands:
gpg --armor --fingerprint --list-sigs <keyid>
gpg --armor --export <keyid>
and publish your key at http://pgp.mit.edu/. Once you commit your changes, log into people.apache.org and pull updates to /www/www.apache.org/dist/hadoop/core. For more details on signing releases, see Signing Releases and Step-By-Step Guide to Mirroring Releases.
To deploy artifacts to the Apache Maven repository, create ~/.m2/settings.xml:
<settings xmlns="http://maven.apache.org/SETTINGS/1.0.0"
          xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
          xsi:schemaLocation="http://maven.apache.org/SETTINGS/1.0.0
                              http://maven.apache.org/xsd/settings-1.0.0.xsd">
  <servers>
    <server>
      <id>apache.staging.https</id>
      <username>Apache username</username>
      <password>Apache password</password>
    </server>
  </servers>
</settings>
Skip this section if this is NOT the first release in a series (i.e. release X.Y.0).
Update CHANGES.txt to include the release version and date (use Unreleased for the date if it is unknown) and remove Trunk (unreleased changes). Commit these changes to trunk.
svn commit -m "Preparing for release X.Y.Z"
Create a branch for the release series:
svn copy https://svn.apache.org/repos/asf/hadoop/common/trunk \
  https://svn.apache.org/repos/asf/hadoop/common/branches/branch-X.Y \
  -m "Branching for X.Y releases"
Update CHANGES.txt to add back in Trunk (unreleased changes). Update the version in build.xml on trunk to X.Y+1.0-dev. Update the hadoop.version number in ivy/libraries.properties on trunk to X.Y+1.0. Commit these changes to trunk.
svn commit -m "Preparing for X.Y+1.0 development"
These operations take place in the release branch.
Check out the branch with:
svn co https://svn.apache.org/repos/asf/hadoop/common/branches/branch-X.Y
Update CHANGES.txt to include the release version and date (this change must be committed to trunk and any intermediate branches between trunk and the branch being released). Update src/docs/releasenotes.html with release notes for this release. You generate these with:
cd src/docs
jira.sh -s https://issues.apache.org/jira -u $user -p $pw \
  -a getIssueList --search \
  "project in (HADOOP,HDFS,MAPREDUCE) and fixVersion = '$vers' and (resolution = Fixed OR 'Target Version/s' = '$vers') ORDER BY KEY" \
  | ./relnotes.py > $vers.html
Then edit releasenotes.html to include the list of items from $vers.html.
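The jira.sh command above reads the shell variables $user, $pw, and $vers, which must be set beforehand. A minimal sketch (all three values below are illustrative placeholders, not real credentials):

```shell
# Placeholder values -- substitute your own Apache JIRA credentials
# and the version number of the release being made.
user=yourApacheId
pw=yourApachePassword
vers=X.Y.Z
```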
Update the version in build.xml to be hadoop-X.Y.N-dev, where N is one greater than the release being made. Update the hadoop.version number in ivy/libraries.properties to be the same as the release being made. Commit these changes.
svn commit -m "Preparing for release X.Y.Z"
If not already done, merge desired patches from trunk into the branch and commit these changes. You can find the revision numbers using svn log CHANGES.txt
in the branch and in trunk.
cd branch-X.Y
svn merge -rR1:R2 ../trunk .
svn commit -m "Merge -r R1:R2 from trunk to X.Y branch. Fixes: HADOOP-A, HADOOP-B."
Tag the release candidate (R is the release candidate number, and starts from 0):
svn copy https://svn.apache.org/repos/asf/hadoop/common/branches/branch-X.Y \
  https://svn.apache.org/repos/asf/hadoop/common/tags/release-X.Y.Z-rcR \
  -m "Hadoop X.Y.Z-rcR release."
To build an official release, you must:
HADOOP-6846 has some scripts that make it easier to build and smoke test a release for 0.21.0 and later.
Build the release & run unit tests. This is captured in part in http://svn.apache.org/viewvc/hadoop/nightly/hudsonBuildHadoopRelease.sh?view=markup. The three parts of this command are intended to be run cumulatively:
## build 32-bit
export JAVA_HOME=/path/to/32bit/jdk
export CFLAGS=-m32
export CXXFLAGS=-m32
ant \
  -Dforrest.home=/usr/local/forrest \
  -Djava5.home=/usr/local/jdk1.5 \
  -Dfindbugs.home=/usr/local/findbugs \
  -Declipse.home=/usr/lib/eclipse \
  -Dxercescroot=/usr/local/xerces-c \
  -Dversion=X.Y.Z \
  -Dhadoop.version=X.Y.Z \
  -Dcompile.native=true \
  -Dcompile.c++=true \
  -Dlibhdfs=true \
  -Dlibrecordio=true \
  -Dtest.junit.output.format=xml \
  veryclean task-controller rpm deb \
  | tee build_log_dir/build32-X.Y.Z.log

## build 64-bit
export JAVA_HOME=/path/to/64bit/jdk
export CFLAGS=-m64
export CXXFLAGS=-m64
ant \
  -Dforrest.home=/usr/local/forrest \
  -Djava5.home=/usr/local/jdk1.5 \
  -Dfindbugs.home=/usr/local/findbugs \
  -Dversion=X.Y.Z \
  -Dhadoop.version=X.Y.Z \
  -Dcompile.native=true \
  -Dcompile.c++=true \
  -Dlibhdfs=true \
  -Dlibrecordio=true \
  tar rpm deb | tee build_log_dir/build64-X.Y.Z.log

## run tests (back in 32-bit mode)
export JAVA_HOME=/path/to/32bit/jdk
export CFLAGS=-m32
export CXXFLAGS=-m32
ant \
  -Dforrest.home=/usr/local/forrest \
  -Djava5.home=/usr/local/jdk1.5 \
  -Dfindbugs.home=/usr/local/findbugs \
  -Declipse.home=/usr/lib/eclipse \
  -Dxercescroot=/usr/local/xerces-c \
  -Dversion=X.Y.Z \
  -Dhadoop.version=X.Y.Z \
  -Dcompile.native=true \
  -Dcompile.c++=true \
  -Dlibhdfs=true \
  -Dlibrecordio=true \
  -Dtest.junit.output.format=xml \
  test test-c++-libhdfs | tee build_log_dir/build32tests-X.Y.Z.log
Generate the checksums of the release file.
gpg --print-mds hadoop-X.Y.Z.tar.gz > hadoop-X.Y.Z.tar.gz.mds
Sign the release:
gpg --armor --output hadoop-X.Y.Z.tar.gz.asc --detach-sig hadoop-X.Y.Z.tar.gz
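Before uploading, it is worth sanity-checking that the checksum file round-trips. A self-contained sketch of the idea, using sha256sum on a throwaway file rather than the real release artifacts (the file names below are placeholders):

```shell
# Create a throwaway "artifact", record its digest, then verify it --
# the same round-trip a voter performs against the published checksum file.
printf 'release payload' > artifact.tar.gz
sha256sum artifact.tar.gz > artifact.tar.gz.sha256
sha256sum -c artifact.tar.gz.sha256   # reports OK when the file is intact
```

A voter would similarly run gpg --verify hadoop-X.Y.Z.tar.gz.asc against the tarball and compare the output of gpg --print-mds with the published .mds file.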
Copy release files to a public place.
ssh people.apache.org mkdir public_html/hadoop-X.Y.Z-candidate-0
scp -p hadoop-X.Y.Z.tar.gz* people.apache.org:public_html/hadoop-X.Y.Z-candidate-0
Stage the release candidate to the maven repository:
ant \
  -Dforrest.home=/usr/local/forrest \
  -Djava5.home=/usr/local/jdk1.5 \
  -Dfindbugs.home=/usr/local/findbugs \
  -Dversion=X.Y.Z \
  -Dhadoop.version=X.Y.Z \
  -Drepo=staging \
  mvn-deploy
## Be ready to respond to the interactive request for your GPG pass-phrase, for signing the artifacts.
For Hadoop 2.x, use mvn deploy:
mvn -Psign deploy -DskipTests
## Be ready to respond to the interactive request for your GPG pass-phrase, for signing the artifacts.
Click Log In in the upper right corner and log in using your Apache user name and password. Select Staging Repositories. Click the Close button above the repository names; this makes your release candidate available at the Staging level. If a staging repository from a previous release candidate is still present, Drop the old one now. In 7 days, if the release vote passes, the release may be published.
Tag the release:
svn move https://svn.apache.org/repos/asf/hadoop/common/tags/release-X.Y.Z-rcR \
  https://svn.apache.org/repos/asf/hadoop/common/tags/release-X.Y.Z \
  -m "Hadoop X.Y.Z release."
Copy release files to the distribution directory and make them writable by the hadoop group.
ssh people.apache.org
cp -pr public_html/hadoop-X.Y.Z-candidate-0 /www/www.apache.org/dist/hadoop/core/hadoop-X.Y.Z
cd /www/www.apache.org/dist/hadoop/core
chgrp -R hadoop hadoop-X.Y.Z
chmod -R g+w hadoop-X.Y.Z
The release directory usually contains just two releases, the most recent from two branches, with a link named 'stable' to the most recent recommended version.
ssh people.apache.org
cd /www/www.apache.org/dist/hadoop/core
rm -rf hadoop-A.B.C
rm stable
ln -s hadoop-A.B.D stable
Release
Prepare to edit the website.
svn co https://svn.apache.org/repos/asf/hadoop/common/site ~/hadoop-site
Update the documentation links in main/author/src/documentation/content/xdocs/site.xml.
Update the release news in main/author/src/documentation/content/xdocs/releases.xml.
Regenerate the site, review it, then commit it.
cd ~/hadoop-site/main
ant -Dforrest.home=/usr/local/forrest -Djava5.home=/usr/local/jdk1.5 update
svn commit -m "Updated site for release X.Y.Z."
It is not usually necessary to update the site front page (http://hadoop.apache.org), but if it is needed, update main/author/src/documentation/content/xdocs/index.xml, then do:
cd ~/hadoop-site/main
ant -Dforrest.home=/usr/local/forrest -Djava5.home=/usr/local/jdk1.5 update
svn commit -m "Updated site front page for release X.Y.Z."
Publish the new release docs by doing the following:
ssh people.apache.org
svn co --depth immediates https://svn.apache.org/repos/asf/hadoop/common/site/main/publish/docs/
cd docs
tar xzf /www/www.apache.org/dist/hadoop/core/hadoop-X.Y.Z/hadoop-X.Y.Z.tar.gz --wildcards '*/docs'
mv hadoop-X.Y.Z/docs rX.Y.Z
svn add rX.Y.Z
svn commit -m "Publishing docs for release X.Y.Z."
rm -r hadoop-X.Y.Z
If the docs/current
and/or docs/stable
links should be updated to the new release, do one or both of the following:
## update current
rm current
ln -s rX.Y.Z current
svn commit -m "Updating link to current docs."

## update stable
rm stable
ln -s rX.Y.Z stable
svn commit -m "Updating link to stable docs."
Generate the jdiff API data for the new release by running, in the branch directory:
ant -Dversion=X.Y.Z api-xml
then commit the new XML file generated in lib/jdiff to both trunk and to the branch (and any intermediate branches between trunk and the branch being released).
svn add lib/jdiff/hadoop_X.Y.Z.xml
svn commit -m "JDiff output for release X.Y.Z"
Update the jdiff.stable value in the X.Y+1 branch's build.xml (which may be trunk) to be the published release (i.e. X.Y.Z).
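This is a one-line Ant property change. A sketch of what the updated property might look like in build.xml (the property name follows the text above; the value shown is the placeholder release number):

```xml
<!-- build.xml in the X.Y+1 branch (possibly trunk): point jdiff.stable
     at the newly published release so the next jdiff run compares against it -->
<property name="jdiff.stable" value="X.Y.Z"/>
```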